Keith Packard

keithp.com

Faces of Open Source

Home
Lists
Mail
Git
Blog

Projects hosted here

Nickle programming language
Ricochet Robots Game Server
Rocket Stuff
Notmuch mail
Calypso CalDAV/CardDAV server
Newlib for tiny embedded systems

Snek programming language

Fontconfig project
Cafepress store for freedesktop stuff

AltOS embedded operating system
ChaosKey random number source

Other stuff

Keith’s personal stuff
Keith’s Papers
Keith’s Blog
Some X project ideas
UPS

I support the Software Freedom Conservancy and encourage you to as well. Donate here
Last edited Tue Apr 30 15:06:33 2019

Torsten Enßlin Cosmologist, Astrophysicist, Scientist

PD Dr. Torsten Enßlin

Cosmologist, Astrophysicist, Scientist

Head of the Information Field Theory Group

at the Max-Planck-Institute for Astrophysics

Highlights
Research
Bayes Forum
Lectures
Software
Data
About Me
Publications
Group
Meetings
Contact

Torsten Enßlin
Cosmologist, Astrophysicist, Scientist

I am a scientist at the Max-Planck-Institut für Astrophysik (MPA) in Garching (near Munich) and a lecturer at the Ludwig Maximilians University Munich, Germany. I am interested in Information Theory, especially Information Field Theory (IFT), Cosmology, and High Energy Astrophysics.
Recent Research Highlights
Next generation imaging

The Information Field Theory Group at the Max Planck Institute for Astrophysics has released a new version of the NIFTy software for scientific imaging. NIFTy5 generates an optimal imaging algorithm from the complex probability model of a measured signal. Such algorithms have already proven themselves in a number of astronomical applications and can now be used in other areas as well.

Learn more

Relics of the Big Bang

In the first fractions of a second after the birth of our universe, not only elementary particles and radiation, but also magnetic fields were generated. A team led by the Max Planck Institute for Astrophysics in Garching has now calculated what these magnetic fields should look like today in the universe – in great detail and in 3D.

Learn more

LOFAR radio observations document rejuvenation in space

In observations of galaxy clusters, astronomers in collaboration with the MPA discovered a new class of cosmic radio sources. With the digital radio telescope Low Frequency Array (LOFAR) they received the longest radio waves that can be measured on Earth. They identified a remarkable “tail” behind a galaxy in the radio light, which must have been re-energized after it had faded away. In the journal Science Advances, the team describes this discovery, which either confirms a theoretical prediction on the interaction between shock waves and radio plasma or represents a novel phenomenon.

Learn more

Wanted: the rotating radio emission of the Milky Way

The magnetic fields of the Milky Way cause electrons travelling at nearly the speed of light to gyrate and emit radio waves. As a consequence, this radiation should also “rotate” in some way, i.e. it should be circularly polarized. This very weak circular polarization of the Milky Way, however, has not been observed so far. Researchers at the Max Planck Institute for Astrophysics and colleagues have now predicted some properties of this polarization and created a “wanted poster” to allow targeted searches. A measurement of the circular polarization would provide important insights into the structure of the galactic magnetic fields and confirm that electrons – and not positrons – are the source of this radio emission in the Milky Way.

Learn more

Highlights Information Field Theory

Highlights Cosmology

Highlights High Energy Astrophysics

Information Field Theory

Information field theory (IFT) is information theory, the logic of reasoning under uncertainty, applied to fields. A field can be any quantity defined over some space, e.g. the air temperature over Europe, the magnetic field strength in the Milky Way, or the matter density in the Universe. IFT describes how data and knowledge can be used to infer field properties. Mathematically it is a statistical field theory and exploits many of the tools developed for such. Practically, it is a framework for signal processing and image reconstruction.

Learn more

Cosmology

The temperature fluctuations in the cosmic microwave background (CMB) and the cosmic matter distribution in the large-scale structure (LSS) are both tracers of the primordial quantum fluctuations, which are believed to have arisen during the very first moments of the Universe, in the inflationary epoch. CMB and LSS are therefore our primary sources of information on cosmology. Their detailed study provides insight into the history, geometry and composition of the Universe. IFT permits us to construct optimal methods to analyse and interpret CMB and LSS data, and to image with high fidelity the cosmic structures imprinted in those data sets.

Learn more

High Energy Astrophysics

The Universe is permeated by high-energy particles and magnetic fields. Charged particles travel at nearly the speed of light, spiraling around the magnetic fields, which themselves are bound to the cosmic plasma. The particles and fields are important ingredients of the interstellar and intergalactic media. They transport energy, they push and heat the thermal gas, and they trace violent processes in cosmic plasmas. A number of observational windows in basically all electromagnetic wavebands, ranging from the radio to the gamma-ray regime, provide us with direct and indirect views of the high-energy Universe. The IFT group develops special-purpose methods to better image relativistic particles and magnetic fields, and even to tomographically reconstruct their distributions within the Milky Way.

Learn more

Lecture on Information Theory & Information Field Theory

Imaging in astronomy, geology and medicine requires intelligent methods to obtain high-fidelity images from noisy, incomplete data. The theoretical and mathematical framework in which imaging and data-analysis methods are derived should be information theory, which these lectures introduce first (first 1/3 of the semester, suited for Bachelor and Master students, 3 ECTS). Based on this, information theory for fields will be developed, which can be used to reconstruct signals from data (remaining 2/3 of the semester, more targeted at Master students, 6 ECTS).

Learn more

Seminar Information Theory & Information Field Theory

The seminar is intended for participants of the lecture on Information Theory (1/3 semester) & Information Field Theory (2/3 semester), the content of which will be assumed to be known by all participants. The main seminar goal is to extend the participants’ knowledge beyond the material covered in the lecture, especially with respect to concrete measurement situations, imaging, and existing algorithms. A second goal is to practice presentations and open discussions.

Learn more

Past & current Software Projects

Active software projects

Astronomical software projects

About Me

2019, Giuseppe and Vanna Cocconi Prize of the EPS as part of the Planck Collaboration
2018, Hochsprung Award for my information field theory lecture that led to a start-up by students
2018, Gruber Prize for Cosmology as part of the Planck collaboration
since 2014, Head of the Information Field Theory Group at MPA
2014, Offer of a full professorship in Theoretical Astroparticle Physics (W3 level, declined), Karlsruhe Institute of Technology, Germany
since 2014, Associate Professor (Privatdozent) at Ludwig-Maximilians-University Munich, Germany
since 2008, Planck Scientist status in the Planck Surveyor Mission (full access to proprietary data)
2003–2016, Head of the MPA Planck Analysis Centre, Garching, Germany
since 2006, Tenured position at MPA
2003–2006, Tenure-track position at MPA
1999–2003, Postdoctoral Researcher, Max Planck Institute for Astrophysics (MPA), Garching, Germany
1999, Research Associate, Physics Department of the University of Toronto, Canada
1996–1999, PhD “summa cum laude” on “Relativistic Particles and Magnetic Fields in Clusters and Filaments of Galaxies”, Rheinische Friedrich-Wilhelms-Universität Bonn & MPI for Radioastronomy, Bonn, Germany

Active Group

Philipp Arras (PhD Student; Radio Aperture Synthesis)
Philipp Frank (PhD Student & former Master Student; Field dynamics inference and simulation)
Philipp Haim (Master Student; Medical imaging)
Johannes Harth-Kitzerow (Master Student; Data compression)
Sebastian Hutschenreuter (PhD Student & former Master Student; Primordial magnetism; Galactic structures)
Fabian Kapfer (Master Student; Multi-frequency radio calibration)
Sebastian Kehl (Postdoc; Machine Learning)
Ivan Kostyuk (PhD Student; Cosmic simulations with deep convolutional neural networks)
Jakob Knollmüller (PhD Student & former Master Student; Bayesian component separation)
Reimar Leike (PhD Student & former Master Student; Information Field Dynamics)
Sara Milosevic (Master Student; Autoencoder)
Max Newrzella (Postdoc; Machine Learning)
Lukas Platz (Master Student; Spatio-spectral imaging of the Fermi gamma-ray sky)
Natalia Porqueres (PhD Student & former Master Student; Large-scale structure reconstruction)
Julian Rüstig (Master Student; Combined Inference of Single Dish and Interferometric Radio Data)
Ann-Kathrin Straub (Postdoc; Machine Learning)
Maxim Wandrowski (Master Student; Denoising, Deconvolving and Decomposing the COMPTEL Gamma-Ray Sky)
Margret Westerkamp (Master Student; Dynamical Field Inference by Ghost Fields)

Alumni

Christoph Lienhard (Master Student; Hamiltonian Monte-Carlo Sampling)
Max Kurthen (Master Student; Causal Inference)
Andreas Koch (Bachelor Student; Bayesian spectral and temporal feature inspection in magnetar giant flare SGR 1806-20)
Marvin Baumann (Bachelor Student; Bayesian multidimensional lightcurve reconstruction of the giant magnetar flare SGR 1806-20)
Tobias Aschenbrenner (Master Student; Adaptive Grids for NIFTy)
Fatos Gashi (Master Student; Stochastic Expectation Propagation in Information Field Theory)
Johannes Oberpriller (Master Student; Bayesian parameter estimation of misspecified models)
Silvan Streit (Master Student; Fast representation of field covariances)
Martin Dupont (Master Student; Information field dynamics for cosmic rays)
Felix Wichmann (Master Student; Advanced aperture synthesis)
Matevz Sraml (Master Student; Gamma ray astronomy)
Theo Steininger (PhD Student; Galactic tomography)
Daniel Pumpe (PhD Student & former Master Student; Towards multifrequency imaging)
Vanessa Böhm (PhD Student; Gravitational lensing of the Cosmic Microwave Background)
Mahsa Ghaempanah (PhD Student; Information field theory for INTEGRAL gamma ray data)
Maximilian Kurthen (Bachelor Student; Discrete spherical harmonics)
Maksim Greiner (PhD Student: The Galactic free electron density — a Bayesian reconstruction; former Master Student: Signal Inference in Radio Astronomy)
Fotis Megas (Bachelor Student; Distinguishing Gravitational Wave Signals by Their Correlation Structures)
Sebastian Dorn (PhD Student: Bayesian Inference of Early-Universe Signals; former Master Student: Non-Gaussianity in the Cosmic Microwave Background)
Valentina Vacca (Postdoc; Radio astronomy)
David Butler (Master Student; Resolving polarised emission in radio interferometry)
Gasper Senk (Master Student; Detecting Cosmic Ray artifacts in astronomical images)
Marco Selig (PhD Student: Information field theory for gamma ray astronomy; former Master Student: Information field theory based high energy photon imaging)
Christian Muench (Master Student; Mathematical foundation of Information Field Dynamics)
Hendrik Junklewitz (PhD Student; Radio astronomy and information field theory)
Niels Oppermann (PhD Student; Signal inference in Galactic astrophysics)
Lars Winderling (Master Student; On the theory of calibration)
Helin Weingartner (Master Student; Statistical modeling and reconstruction of diffuse X-ray emission from galaxy clusters)
Maximilian Uhlig (Master Student; Cosmic ray driven Winds in Galaxies)
Maximilian Ullher (Bachelor Student; A Faraday map of the Milky Way under the assumption of approximate symmetries)
Michael Bell (Postdoc; Radio Astronomy)
Jens Jasche (PhD Student: Bayesian Methods for analyzing the large scale structure of the Universe; former Master Student: On the coupling between cosmic rays and primordial gas)
Mona Frommert (PhD Student; Temperature and Polarization of the Cosmic Microwave Background)
Cornelius Weig (Master Student; Information Field Theory applied to a spatially distorted log-normal field with Poissonian noise)
Petr Kuchar (Master Student; Characteristics of magnetic fields in galaxy clusters from Faraday rotation data: REALMAF and its use on Hydra A)
Andre Walkens (Master Student; Studying magnetic turbulence with radio polarimetry)
Herbert Kaiser (Master Student; Cosmic Rays and primordial chemistry)
Francisco-Shu Kitaura (PhD Student; Cosmic Cartography: Bayesian Reconstruction of the Cosmological Large-Scale Structure with ARGO, an Algorithm for the Reconstruction of Galaxy-traced Over-densities)
Gordana Stojceska (Master Student; Statistical Sampling in Multidimensional Parameter Spaces: Algorithms and Applications)
Ilya Saverchenko (Master Student; Interacting Galaxies – Matching Simulations to Observations)
Christoph Pfrommer (PhD Student; On the role of cosmic rays in clusters of galaxies)
Corina Vogt (PhD Student; Investigations of Faraday Rotation Maps of Extended Radio Sources in order to determine Cluster Magnetic Field Properties)

Meetings

Bayesian inference meets radio reality (July 15–20, 2018)

Contact

Address
MPA
Karl-Schwarzschild-Str. 1
85748 Garching
Germany
Office: 010
Email
ensslin@mpa-garching.mpg.de
Phone
+49 (0) 89 30000 2243

Responsibility for content & ©: Torsten Enßlin. All rights reserved.
Privacy policy, Design: HTML5 UP

Joshua Stevens a visual communicator with a background in data science and cartography

Joshua Stevens

Welcome! I am Joshua Stevens, a visual communicator with a background in data science and cartography.

I combine journalism and design to understand and communicate about our planet.

Latest post: Tutorial: Turning on The Lights
Highlights

One Day of Global Aerosols: Tropical cyclones, dust storms, and fires spread tiny particles throughout the atmosphere.

Probing Kilauea’s Plume: As fissures split the ground, sulfates billowed into the Hawaiian skies. Satellites reveal their path.

The Channeled Scablands: When a glacial dam broke during the last Ice Age, it changed the Columbian Plateau forever.

Signature of a Storm: Modeling the structure of hurricane Maria’s stormy clouds.

The Best Places to See the Eclipse: Analyzing seventeen years of cloud probability.

Sulfur Dioxide Spreads Over Iraq: After conflict, sulfur and carbon cloud the skies of Mosul.

Power Outages Plague Southeast: Satellites and power companies reveal Matthew’s impact.

Home Blog About Work Contact

© 2019 Joshua Stevens

X Window System Basics

Before we begin…

Since this article contains a lot of interactive demos relying on fairly modern browser technology, let’s make sure that everything is OK before continuing.

If you can see the stipple pattern above, that means that your browser is modern enough to see the interactive demos.

You might have noticed that when you ran your mouse over the stipple, your cursor changed. That’s because this isn’t just any old stipple image, that stipple is actually the background of a full X server session running in your browser using HTML5 canvas. All of the interactive demos will use this framework to explain what’s going on under the hood.
Basic Architecture

Although it may sound a bit stilted, notice how I keep saying “the X Window System” instead of the more traditional shorthands “X”, “X11”, or “Xorg”? I want to be very careful to separate the ideas and design of the system from its component parts.

The X Window System is a networked display system. A server component, the X server, is responsible for coordinating between all of the clients connected, taking input from the mouse and keyboard, and pushing pixels on the output. The most popular X server implementation is the Xorg X server, developed by the X.Org Foundation and community. There are other X server implementations: you might remember that Xorg was forked from XFree86 a decade ago, and Sun Microsystems had several X server implementations of its own, Xsun and XNeWS. Today, Xorg is the dominant X server implementation, getting most of the development. But back in the day, multiple competing implementations existed.

X servers and X clients all talk a standardized network protocol to each other, known as X11. This protocol is well-specified, from the wire format to the semantics of every request. The protocol documentation linked above is invaluable documentation for any hacker who wants to learn more about this stuff.

Applications and toolkits don’t write the wire format onto the socket directly, however. They often use client libraries that implement the protocol, like the traditional Xlib library, or the somewhat newer xcb.
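To make that concrete, here is a minimal sketch of my own (not code from this article) of what talking to the X server through Xlib looks like; build it with cc xhello.c -lX11:

#include <stdio.h>
#include <stdlib.h>
#include <X11/Xlib.h>

int main(void)
{
    /* Open the socket to the server named by $DISPLAY and perform the
     * X11 connection handshake; Xlib speaks the wire protocol for us. */
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy) {
        fprintf(stderr, "cannot connect to the X server\n");
        return EXIT_FAILURE;
    }

    printf("server vendor: %s, protocol %d.%d\n",
           ServerVendor(dpy), ProtocolVersion(dpy), ProtocolRevision(dpy));

    XCloseDisplay(dpy);
    return EXIT_SUCCESS;
}

xcb exposes the same X11 protocol at a lower level, trading Xlib’s conveniences for more direct, asynchronous request handling.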

I’m going to try and be precise in my nomenclature in this article.

When I talk about features or design decisions of the overall system, I will try to call it the X Window System, even if it sounds a bit verbose. e.g. “The X Window System provides us Pixmaps, which are images in the server’s memory.”

When I talk about features or details of the network protocol, I will talk about the X11 protocol. e.g. “The X11 protocol provides for a generic extension mechanism, which allows for a forward-compatible way to implement new features without having to redesign older parts of X11”.

When I talk about the behavior of a client or a server, I’ll say X client or X server e.g. “Using the MIT-SHM extension, X clients can pass memory buffers to the X server using POSIX shared memory, which prevents networking and large copies”.

When I talk about features or the architecture in the Xorg X server implementation, I’ll mention it explicitly as Xorg or the Xorg X server, e.g. “In order to make drawing calls accelerated, Xorg video drivers can provide hardware-accelerated versions of certain drawing primitives through EXA.”

If I ever say “such-and-such is a feature of X”, it’s a bug.
Requests and events

As said in the Introduction, X clients connect to an X server, and they speak an X11 protocol. In more detail, clients can send requests to ask the X server to do something. A simple example of a request is CreateWindow, which tells the X server to “create a window”. We’ll learn more about windows in a bit.

If something interesting happens inside the X server (for instance, a window was created), the X server can send X clients an event. To prevent network traffic from getting overloaded, X clients need to tell the X server which events they’re interested in. The network side of it is a bit complicated (it’s ugly, let’s not get into it), but programs using Xlib can tell the X server that they want to listen to specific events using the XSelectInput function call.
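In Xlib terms, that flow looks roughly like the following sketch (mine, not the article’s; the function name and the dpy/win parameters are assumed to come from the caller):

#include <X11/Xlib.h>

/* Sketch: tell the X server which events we care about on `win`,
 * then process events as the server delivers them. */
static void event_loop(Display *dpy, Window win)
{
    XSelectInput(dpy, win, ExposureMask | ButtonPressMask | StructureNotifyMask);

    for (;;) {
        XEvent ev;
        XNextEvent(dpy, &ev);      /* blocks until the next event arrives */

        switch (ev.type) {
        case Expose:               /* some of our pixels need redrawing */
            break;
        case ButtonPress:          /* the user clicked inside the window */
            break;
        case MapNotify:            /* the window became viewable */
            break;
        }
    }
}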
Let’s go

[Interactive demo: a live X server session showing windows kitten1.png and kitten2.png under the Root Window, with an inspector button and a window list]

Let’s start super simple. Here’s a simple X server with two windows in it. They don’t have any title bars, and you can’t drag them around because I haven’t launched a window manager yet. There’s just two windows, each showing a kitten.

You might notice the i button in the top right of the demo above. Click on it, and the inspector will pop open. On the left side of the inspector is a list of windows, and on the right side are the properties and attributes for the selected window.

This lets you dig into all the demos in this article in detail, showing how things are constructed. So, if you ever find yourself not quite understanding something I’m saying, playing around with the inspector can often help.

A window, in the X11 protocol, is a structure that allows an X client connecting to the X server to display something on the screen, and take input as well. Windows are fairly simple: they have an X, a Y, a width, and a height. This forms a rectangle which is known as the window’s bounding rectangle. The window occupies this space. Windows also have a defined stacking order, which means that some windows can be above other windows. If a window is higher in the stacking order, it occludes the windows below it.

For historical reasons related to some initial implementations, showing a window in the X11 protocol is called mapping a window, and hiding a window is called unmapping. Windows, when initially created, are unmapped (or hidden). Clients have to map windows by sending a MapWindow request to the X server, and clients can later unmap windows using UnmapWindow. Note that unmapping a window doesn’t destroy a window — doing so simply hides it. The window can then be mapped again later. It’s more like minimizing a window (in fact, that’s how minimization is implemented on most window managers).
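As a sketch of that life cycle through Xlib (my example; the helper name, geometry and the sleep are made up):

#include <unistd.h>
#include <X11/Xlib.h>

/* Sketch: windows are created unmapped; MapWindow makes them visible,
 * and UnmapWindow hides them again without destroying them. */
static Window create_show_hide(Display *dpy)
{
    int screen = DefaultScreen(dpy);
    Window win = XCreateSimpleWindow(dpy, RootWindow(dpy, screen),
                                     20, 20, 300, 200,            /* x, y, w, h */
                                     0, BlackPixel(dpy, screen),  /* border */
                                     WhitePixel(dpy, screen));    /* background */

    XMapWindow(dpy, win);      /* the MapWindow request: show the window */
    XFlush(dpy);               /* push buffered requests onto the wire */
    sleep(2);

    XUnmapWindow(dpy, win);    /* hide it; the window object still exists */
    XFlush(dpy);
    return win;
}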

So, we know what windows are. But how are those kittens getting on the screen?
Exposing historical baggage

In the late 80s, when the X Window System was designed, RAM was costly and scarce. If we stored window contents in system memory, a single maximized window would take 1 byte * (800 * 600 pixels) = almost half a megabyte! The user can’t open more than 10 maximized windows before exhausting the 5MB in their workstation, and with 16-bit True Color around the corner, we can’t fit more than 5! No, no, this can’t possibly scale.

So, if we can’t store window pixels in system memory, where can we store them? They have to exist somewhere, right?

Nope. The trick the X Window System authors realized is that the pixels for a window don’t have to exist at all. We only have one giant buffer of pixels for the entire screen, the front buffer, and windows borrow pixels to draw to.
[Interactive demo: two overlapping windows, kitten1.png occluding kitten2.png, with the inspector; the lower window moves from side to side]

The demo above shows two windows, with one window occluding the other. The window underneath moves from side to side, and you can see that when it moves, the window blanks out for a moment before redrawing itself.

The window on top, marked as kitten1.png in the inspector, owns a rectangle in the center of the screen. The window below, kitten2.png, owns a “L” shape slightly below and to the left.

When the X server needs pixels from a window, it tells the window to redraw the area it’s missing pixels for using an Expose event. The window then responds by submitting drawing commands back to the X server. The X server then processes all these drawing commands, touching pixels on the front buffer where the window is.
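The client’s side of that exchange can be sketched like this (my code, assuming a gc created earlier with XCreateGC):

#include <X11/Xlib.h>

/* Sketch: answer an Expose event by redrawing the damaged area.
 * The server clips our drawing to the window's visible pixels anyway. */
static void handle_expose(Display *dpy, Window win, GC gc, XExposeEvent *e)
{
    /* e->x, e->y, e->width, e->height describe the rectangle whose
     * pixels the server is missing. A real client would redraw its
     * content here; this sketch just fills the hole with the GC's
     * foreground color. */
    XFillRectangle(dpy, win, gc, e->x, e->y, e->width, e->height);

    /* Expose events can arrive in batches; e->count says how many more
     * follow, so clients often wait for count == 0 and redraw once. */
}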

You can also drag kitten1.png around. Try it, and see if you can figure out how this behaves. Does it seem familiar? The authors of Windows chose the same design when they wrote their display server; they called their equivalent of the Expose event WM_PAINT instead.
Windows of all shapes and sizes

[Interactive demo: a circular window, kittencircle.png, over kitten2.png, with the inspector]

I said above that windows are rectangles. In the above demo, you see a circular window, so it’s quite obvious that I lied. You can still drag the top window around, but it might slow your browser down. Sorry, the math here is a lot more computationally expensive, especially in JavaScript.

Internally, the X server keeps a record of all the pixels that are currently visible for every window, containing the part of the window’s bounding rectangle that is currently showing on the screen. It’s calculated by taking the window’s overall bounding rectangle and then subtracting out the bounding rectangles of all the windows above it. It’s somewhat like a simple 1-bit alpha mask.

This data structure is called the clip list in the Xorg codebase. As the word “list” in the name might tell you, it’s not actually a 1-bit alpha mask. That would waste too much memory. Again, for a full-screen 800×600 window, you don’t want a giant alpha mask in the server’s memory telling it that it’s mostly visible, or mostly obscured. Instead, the X server stores a more compact version of the same thing: a list of rectangles containing the areas where the window is visible. For an 800×600 window that’s not occluded, we go from a 60 kB bitmap mask to a single rectangle of four 32-bit numbers, for a total of 16 bytes. Quite a savings!

This data structure is seen throughout X11 programming, and it’s known as a region. If you’ve ever used the cairo graphics library, it has an implementation of regions, called cairo_region_t. (Actually, the implementation in the Xorg codebase and the one in cairo are the same code; both use the pixman library underneath.)
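Xlib ships a client-side implementation of regions as well, so the clip-list computation described above can be sketched against that API (my example; the function name and inputs are made up):

#include <X11/Xlib.h>
#include <X11/Xutil.h>   /* Region, XCreateRegion, XSubtractRegion, ... */

/* Sketch: compute a window's visible region by starting from its
 * bounding rectangle and subtracting the windows stacked above it. */
static Region visible_region(XRectangle bounds, XRectangle *above, int n_above)
{
    Region visible = XCreateRegion();
    XUnionRectWithRegion(&bounds, visible, visible);

    for (int i = 0; i < n_above; i++) {
        Region occluder = XCreateRegion();
        XUnionRectWithRegion(&above[i], occluder, occluder);
        XSubtractRegion(visible, occluder, visible);  /* visible -= occluder */
        XDestroyRegion(occluder);
    }
    return visible;   /* stored compactly as a list of rectangles */
}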

However, like any data structure, it isn’t efficient in all use cases. A simple example here is a checkerboard pattern: instead of one bit per pixel, now we have 16 bytes per pixel! This is much more computationally expensive to work with, and takes a lot more memory. Thankfully, not a lot of software will use a checkerboard region.

Ahem, sorry. Enough reminiscing. Anyway, the story goes that when people working on the X server source code were doing anything with windows, they had code that looked like this:

void do_something_with_window(Window window)
{
Region region = new Region();
region.add_rectangle(window.bounding_rectangle);

region.intersect_rectangle(other_window.whatever);
draw_some_nonsense(region);
}

That is, the code was almost always taking the window’s bounding rectangle, and then converting it into a region to use elsewhere. So, they said to themselves, “Hey, why don’t we let the user set any arbitrary region that will be used instead of the bounding region?” And thus, the X SHAPE Extension was born. The X SHAPE Extension lets a client hand the X server an arbitrary bounding region to use in place of the bounding rectangle.

This is how we get a circular window, as above: we construct a region of a circle, and then set the window’s bounding region to be that circle. This is also how the classic xeyes and oclock get their classic cutout shapes.
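A sketch of how a client might do exactly that with the SHAPE extension (mine, not from the article; a real client could equally build a 1-bit pixmap and use XShapeCombineMask):

#include <math.h>
#include <X11/Xlib.h>
#include <X11/Xutil.h>
#include <X11/extensions/shape.h>   /* the SHAPE extension; link with -lXext */

/* Sketch: approximate a circle with a stack of 1-pixel-tall rectangles
 * and install it as the window's bounding region. */
static void make_window_circular(Display *dpy, Window win, int r)
{
    Region circle = XCreateRegion();

    for (int y = -r; y <= r; y++) {
        int half = (int) (sqrt((double) (r * r - y * y)) + 0.5);
        XRectangle row = { r - half, r + y, 2 * half, 1 };  /* x, y, w, h */
        XUnionRectWithRegion(&row, circle, circle);
    }

    XShapeCombineRegion(dpy, win, ShapeBounding, 0, 0, circle, ShapeSet);
    XDestroyRegion(circle);
}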

The inspector will show the bounding shape region of a window in yellow.

There are two more notes I want to make here. Although this might seem like it would allow for some windows to be semitransparent by poking holes in them, it’s still all-or-nothing: either the window is fully transparent, or it’s not. That is, if you take your finger and point to any pixel on the above display, I can do some math and tell you the exact window that will paint to that pixel at any point in time. The X SHAPE Extension doesn’t change this, it just makes the logic for figuring out which window “owns” a specific pixel more complicated than testing against simple rectangles. In order to allow for true semitransparent windows, we’ll have to somehow figure out a way to blend between the pixels that windows draw. We’ll explore that another time.

Additionally, setting a bounding region that’s larger than the window’s width/height can’t actually make the window own some pixels that it wouldn’t own otherwise. The X SHAPE Extension only allows a window to carve away from where it would normally paint and give that space to the windows underneath.
Pixmaps

The more attentive of you playing around with that last demo might have noticed something special when poking around in kittencircle.png with the inspector. In the Attributes section of the inspector, you might have noticed a background-pixmap attribute, and hovering over it shows the circle kitten image! That raises a few questions: first of all, what is a Pixmap? Why didn’t the other windows have a background-pixmap attribute? They seemed to have a background-pixel attribute instead. What’s with that?

You might have been wondering how it could have been efficient to keep transferring missing pixels for the kitten images at 60 frames per second, over a network connection in the 80s. The answer is that we’re not. Instead, when we create the window and load the image in, the code creates a Pixmap, which allows us to have memory-backed pixel storage on the server. We then upload the pixels for the kitten image to the Pixmap once, using a PutImage request.

Whenever we want to draw to the window from an Expose event, we simply tell the X server to copy from the kitten pixels it already has in its memory space. To do that, we make a CopyArea request from the Pixmap to the Window. No pixel data has to cross the network again.
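The upload-once, copy-many pattern can be sketched like so (my code; it assumes data holds w×h pixels already laid out in the server’s 32-bit ZPixmap format):

#include <X11/Xlib.h>

/* Sketch: upload image pixels to a server-side Pixmap once (PutImage);
 * afterwards every Expose can be answered with a cheap CopyArea. */
static Pixmap upload_image(Display *dpy, Window win, GC gc,
                           char *data, unsigned w, unsigned h)
{
    int screen = DefaultScreen(dpy);
    unsigned depth = DefaultDepth(dpy, screen);

    /* Pixel storage that lives in the X server's memory space. */
    Pixmap pix = XCreatePixmap(dpy, win, w, h, depth);

    XImage *img = XCreateImage(dpy, DefaultVisual(dpy, screen), depth,
                               ZPixmap, 0, data, w, h, 32, 0);
    XPutImage(dpy, pix, gc, img, 0, 0, 0, 0, w, h);  /* the one-time upload */
    img->data = NULL;        /* keep ownership of `data` ourselves */
    XDestroyImage(img);

    /* Later, on Expose:
     *     XCopyArea(dpy, pix, win, gc, 0, 0, w, h, 0, 0);
     * copies entirely inside the server; no pixels cross the wire. */
    return pix;
}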

You might have noticed the word Drawable in the protocol documentation. A Drawable is something that the user can draw to, which is either a Pixmap or a Window. A Pixmap draws to its own memory storage, but a Window draws to the pixels on the front buffer which it owns.

“OK, so, then what’s this about background-pixel and background-pixmap?”

When the X server sends Expose events to a window, keep in mind that this means there are pixels “missing” from the front buffer that need to be redrawn. The server needs to fill in the missing pixels with something, and it gives a window three options (sketched in code after the list):

It can fill the pixels with a color. This is what the background-pixel attribute specifies.
It can fill the pixels with the contents of a pixmap. This is what the background-pixmap attribute specifies.
It can do nothing, and simply leave whatever pixels were there before, and wait for the application to redraw. This is also the default, and it’s what you get when there’s no explicit background-pixel or background-pixmap attributes. This is how Windows works, and why you see the repeated “IE6 crashed” window when iexplore.exe hangs: it can’t respond to the WM_PAINT events, so the old pixel contents stay on the screen!
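Setting these attributes through Xlib looks roughly like this (a sketch of mine; w1, w2 and kitten are assumed to exist already):

#include <X11/Xlib.h>

/* Sketch: choose how the X server fills freshly exposed pixels. */
static void set_backgrounds(Display *dpy, Window w1, Window w2, Pixmap kitten)
{
    XSetWindowAttributes attrs;

    /* Option 1: a solid color (the background-pixel attribute). */
    attrs.background_pixel = WhitePixel(dpy, DefaultScreen(dpy));
    XChangeWindowAttributes(dpy, w1, CWBackPixel, &attrs);

    /* Option 2: tile with a pixmap's contents (background-pixmap). */
    attrs.background_pixmap = kitten;
    XChangeWindowAttributes(dpy, w2, CWBackPixmap, &attrs);

    /* Option 3, the default, needs no call at all: with neither attribute
     * set, the server leaves the stale pixels in place and waits for the
     * client to redraw in response to Expose. */
}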

Coming up…

What is that mysterious “Root Window” we saw in the inspector? How do desktop environments set it up so that windows can be dragged and resized? Why do I have to use GtkEventBox in order to make my widgets respond to input?

All those questions, and more, will be answered… next time! In “Advanced Window Techniques”!
⟨Written by Jasper St. Pierre, among others⟩

Corentin Boissier

1. Welcome 2. Biography 3. Compositions 4. To listen to my compositions 5. Photos 6. List of uploads CB/CB2 6bis. List of uploads CB3/CB4/CB5 7. Ideal Discotheque 8. Gallery of my 88 favorite Composers 9. The Internet Fake “Pandora Selfridge” discovered, part 1 9bis. The Internet Fake “Pandora Selfridge” discovered, part 2 10. Contact

24 years old (2019)

CORENTIN BOISSIER, neoromantic composer

Biography

Corentin Boissier has been composing on musical scores since he was 6 years old. At the age of 9, he was discovered by the composer Thierry Escaich: « He already possesses real qualities that will make him an accomplished musician. He already has a genuine sense of harmonic color, of rhythmic invention and of renewal of thematic material. All these qualities show an open-minded spirit indicating a real gift for composition ».

He studied at the CRR of Paris in the specialized classes of Musical Writing and Orchestration, where he obtained both Diplomas of Musical Studies (DEM) “With Highest Distinction”. In 2019 he obtained the Master of Superior Musical Writing at the National Superior Conservatory of Music (CNSM) of Paris, with four Prizes “With Highest Distinction”: Harmony, Counterpoint, Fugue & Forms, and Polyphony; and seven Certificates (Orchestration, Arrangement, Analysis…). His thesis, The Mini Piano Concerto of the 1940s–60s: a trend triggered by Richard Addinsell’s Warsaw Concerto, received the congratulations of the jury.

Eager to write directly accessible classical music, Corentin Boissier has to date composed more than twenty works in a neo-romantic spirit. His ballade for alto saxophone and piano, From Midnight to Dawn, was premiered during the 2014 Musical Festival of Bagnac-sur-Célé by the duo Christine Marchais and Marc Sieffert. His encounter with the young multi-award-winning pianist Philippe Hattat resulted in performances of his Piano Sonata No. 1 « Romantica », his Double Toccata, his concert piece Solitude, as well as three of his 24 Preludes to Travel. His three piano pieces Romantic Young Ladies were recorded and uploaded to YouTube by the Italian concert pianist Annarita Santagada. His Glamour Concerto, in its version for solo piano, was recorded in 2016 in Quebec by the concert pianist Minna Re Shin. The Aria of Past Times has been performed successively by the soprano Sayuri Araida, the baritone Aurélien Gasse, the harmonicist Claude Saubestre, the flutist Iris Daverio and the cellist Eric Tinkerhess, who also gave the world premiere of the Sonata for Cello and Piano with the composer at the piano.

Interested in different sides of musical writing, Corentin Boissier is also active as an orchestrator and arranger. Notably, his orchestration of the 9th of Alfredo Casella’s Nine Pieces for piano op. 24 was performed in concert by the Orchestra of the Gardiens de la Paix in 2016 at the Church « Saint-Joseph des Nations »; his orchestration of Debussy’s Passepied was performed at the Auditorium Marcel Landowski in Paris; and his arrangement of the jazz standard Caravan, written for the Local Brass Quintet, was performed live at the Musée de l’Orangerie in Paris in 2017. About his orchestration of Francis Poulenc’s Humoresque, the composer Nicolas Bacri wrote: « Congratulations on your orchestration. It is very well rendered and perfectly in the style ».

In order to make little-known post-romantic works of the 20th and 21st centuries available to a large audience, Corentin Boissier runs the cultural music channels collectionCB, collectionCB2, CB3, CB4 & CB5 on YouTube (more than 2,500 uploads to date). He has also published on the web his « Ideal Discotheque of more than 1,700 orchestral works of feelings »; the famous American music critic Walter Simmons agreed to be its dedicatee.

In February 2018 his Piano Sonata No. 2 « Appassionata » was premiered by the concert pianist Célia Oneto Bensaid at the Salle Cortot in Paris. A video recording, made in the studio, has been uploaded to YouTube.

In March 2019, for an upcoming CD release, his two piano concertos (Glamour Concerto and Philip Marlowe Piano Concerto) were recorded by the British concert pianist Valentina Seferinova and the Ukrainian Festival Orchestra under the direction of the American conductor John McLaughlin Williams.


La Défense, Paris: I live here.


Stability of the solar system


Jacques Laskar, Astronomie et Systèmes Dynamiques, Paris, France

This article will briefly cover: Historical and current aspects of the problem of the stability of the solar system

Contents

1 Introduction
2 Laplace-Lagrange stability of the Solar System
3 The problem of the eccentricities
4 Chaos in the Solar System
5 Evolution of planetary orbits
6 Marginal stability of the Solar System
7 Planetary collisions in the Solar System
8 Collisions of Mercury, Mars and Venus with the Earth
9 References

Introduction

The problem of the stability of the solar system has fascinated astronomers and mathematicians since antiquity, when it was observed that among the fixed stars, there were also wandering stars, the planets. Efforts were first focused on finding a regularity in the motion of these wanderers, so their movement among the fixed stars could be predicted. For Hipparchus and Ptolemy, the ideal model was a combination of uniform circular motions, the epicycles, which were continually adjusted over the centuries to conform to the observed course of the planets.

From 1609 to 1618, Kepler fixed the planets’ trajectories: having assimilated the lessons of Copernicus, he placed the Sun at the center of the universe and, based on the observations of Tycho Brahe, showed that the planets described ellipses around the Sun. At the end of a revolution, each planet found itself back where it started and so retraced the same ellipse. This vision of a perfectly stable solar system in which all orbits were periodic would not remain unchallenged for long.

In 1687 Newton announced the law of universal gravitation. By restricting this law to the interactions of planets with the Sun alone, one obtains Kepler’s phenomenology. But Newton’s law applies to all interactions: Jupiter is attracted by the Sun, as is Saturn, but Jupiter and Saturn also attract each other. There is no reason to assume that the planets’ orbits are fixed invariant ellipses, and Kepler’s beautiful regularity is destroyed.

In Newton’s view, the perturbations among the planets were strong enough to destroy the stability of the solar system, and divine intervention was required from time to time to restore planets’ orbits to their place. Moreover, Newton’s law did not yet enjoy its present status, and astronomers wondered if it was truly enough to account for the observed movements of bodies in the solar system.

The problem of solar system stability was a real one, since after Kepler, Halley was able to show, by analyzing the Chaldean observations transmitted by Ptolemy, that Saturn was moving away from the Sun while Jupiter was moving closer. By crudely extrapolating these observations, one finds that six million years ago Jupiter and Saturn were at the same distance from the Sun. In the 18th century, Laplace took up one of these observations, which he dated March 1st, 228 BC: at 4:23 am, mean Paris time, Saturn was observed two fingers under Gamma in Virgo.

The variations of planetary orbits were such that, in order to predict the planets’ positions in the sky, de LaLande was required to introduce artificial “secular” terms in his ephemeris tables. Could these terms be accounted for by Newton’s law?
Laplace-Lagrange stability of the Solar System
Figure 1: Elliptical elements. At any given time, a planet (J) can be considered to move on an elliptical orbit with semi-major axis a and eccentricity e, with the Sun at one focus (O). The orientation of this ellipse with respect to a fixed plane Π, and a direction of reference OX, is given by three angles: the inclination i, the longitude of the node Ω, and the longitude of perihelion ϖ = Ω + ω, where ω is the argument of perihelion (P). The position of the planet on this ellipse is given by the mean longitude λ = M + ϖ, where M (the mean anomaly) is an angle proportional to the area OPJ (Kepler’s second law).

The problem of these discrepancies between computations and observations remained open until the end of the 18th century, when Lagrange and Laplace correctly formulated the equations of motion (for a detailed historical account of this problem and more complete references, see (Laskar, 2013)). Lagrange started from the fact that the motion of a planet remains close, over a short duration, to a Keplerian ellipse, and so could use this ellipse as the basis for a coordinate system (Fig.1). Lagrange then wrote the differential equations that govern the variations in this elliptic motion under the effect of perturbations from other planets, thus inaugurating the methods of classical celestial mechanics. Laplace and Lagrange, whose work converged on this point, calculated secular variations, in other words long-term variations in the planets’ semi-major axes under the effects of perturbations by the other planets. Their calculations showed that, up to first order in the masses of the planets, these variations vanish (Poisson, Haretu and Poincaré later showed that this result remains true through second order in the masses of the planets, but not through third order).

This result seemed to contradict Ptolemy’s observations from antiquity, but by examining the periodic perturbations between Jupiter and Saturn, Laplace discovered a quasi-resonant term (2λ_Jupiter − 5λ_Saturn) in their longitudes. This term has an amplitude of 46′50′′ in Saturn’s longitude, and a period of about 900 years. This explains why observations taken in 228 BC, and then in 1590 and 1650, could give the impression of a secular term.
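A back-of-the-envelope check (not from the original article, using the familiar orbital periods of Jupiter, 11.86 yr, and Saturn, 29.46 yr) shows where this period comes from:

\[
2 n_J - 5 n_S = 2\,\frac{360^\circ}{11.86\ \mathrm{yr}} - 5\,\frac{360^\circ}{29.46\ \mathrm{yr}} \approx -0.41^\circ/\mathrm{yr},
\qquad
\frac{360^\circ}{0.41^\circ/\mathrm{yr}} \approx 880\ \mathrm{yr},
\]

so the argument 2λ_Jupiter − 5λ_Saturn circulates once in roughly 900 years, as Laplace found.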

Laplace then calculated many other periodic terms, and established a theory of motion for Jupiter and Saturn in very good agreement with 18th century observations. Above all, using the same theory, he was able to account for Ptolemy’s observations to within one minute of arc, without additional terms in his calculations. He thus showed that Newton’s law was in itself sufficient to explain the movement of the planets throughout known history, and this exploit no doubt partly accounted for Laplace’s determinism.

This result, in which Laplace and Lagrange demonstrated that the planets’ semi-major axes undergo only small oscillations and do not have secular terms, was the first major stability result for the Solar System. At the same time, Laplace firmly established Newton’s law as the universal explanation for the motion of the celestial bodies.
The problem of the eccentricities

The stability of the semi-major axes of the planets is not sufficient to ensure the stability of the Solar System. Indeed, if the eccentricity of the Earth becomes larger than 0.1, and the eccentricity of Mars becomes larger than 0.3, then collisions between these two planets can occur. The problem of the stability of the eccentricities and inclinations of the planets was addressed by Laplace and Lagrange in an additional set of papers.
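To see why these thresholds matter, a rough check (the semi-major axes here are standard values, not from the article): with a_Earth ≈ 1 AU and a_Mars ≈ 1.52 AU,

\[
a_{\mathrm{Earth}}(1 + e_{\mathrm{Earth}}) = 1 \times 1.1 = 1.10\ \mathrm{AU},
\qquad
a_{\mathrm{Mars}}(1 - e_{\mathrm{Mars}}) \approx 1.52 \times 0.7 \approx 1.07\ \mathrm{AU},
\]

so the Earth’s aphelion would then lie beyond Mars’ perihelion, and the two orbits could cross.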

Taking into account only terms of first order in the perturbation series, they showed that the system of equations describing the mean motions of eccentricities and inclinations may be reduced to a system of linear differential equations with constant coefficients depending on the planetary masses and semi-major axes.

\[
\frac{d}{dt}
\begin{bmatrix} z_1 \\ \vdots \\ z_k \\ \zeta_1 \\ \vdots \\ \zeta_k \end{bmatrix}
= \sqrt{-1}\,
\begin{bmatrix} A_k & 0_k \\ 0_k & B_k \end{bmatrix}
\begin{bmatrix} z_1 \\ \vdots \\ z_k \\ \zeta_1 \\ \vdots \\ \zeta_k \end{bmatrix}
\]

where for each planet \(j\), \(z_j = e_j \exp(\sqrt{-1}\,\varpi_j)\) and \(\zeta_j = \sin(i_j/2)\,\exp(\sqrt{-1}\,\Omega_j)\); \(A_k\) and \(B_k\) are \((k,k)\) matrices with real coefficients depending on the values of the planetary masses and semi-major axes, \(0_k\) is the \((k,k)\) zero matrix, and \((a_j, e_j, i_j, \lambda_j, \varpi_j, \Omega_j)\) are the classical elliptical elements (Fig. 1).

Using the conservation of angular momentum, Laplace demonstrated that, provided all the planets rotate around the Sun in the same direction, polynomial or exponential solutions cannot exist for this system. He concluded that the eigenvalues g_i, s_i of A_k and B_k are real, and that the solutions of this differential system are quasiperiodic expressions of the form

\[
z_j = \sum_{i=1}^{k} \alpha_{ij}\, e^{\sqrt{-1}\, g_i t}, \qquad
\zeta_j = \sum_{i=1}^{k} \beta_{ij}\, e^{\sqrt{-1}\, s_i t},
\]

where α_ij and β_ij are complex quantities. The frequencies g_i, s_i are called the secular frequencies of the Solar System, and their values, as computed with a more complete model (Laskar, 1990; Laskar et al., 2004), are given in Table 1.
Table 1. Fundamental frequencies of the precession motion of the solar system. These values are taken as the mean values over 20 Myr for the inner planets and over 50 Myr for the outer planets (Laskar et al., 2004). For the inner planets, the frequencies can change significantly with time due to chaotic diffusion (Laskar, 1990; Laskar et al., 2004), which is why fewer significant digits are given for their secular frequencies.

perihelion frequencies (arcsec/yr)   node frequencies (arcsec/yr)
g_1    5.59         s_1    -5.59
g_2    7.452        s_2    -7.05
g_3   17.368        s_3   -18.85
g_4   17.916        s_4   -17.755
g_5    4.257452     s_5     0
g_6   28.245        s_6   -26.347855
g_7    3.087951     s_7    -2.9925259
g_8    0.673021     s_8    -0.691736
g_9   -0.34994      s_9    -0.34998
Figure 2: The Laplace-Lagrange solutions for the motion of the planets are combinations of circular and uniform motions whose frequencies are the precession frequencies g_i and s_i of the solar system (Table 1). The eccentricity e_3 of the Earth is given by OP, while the inclination of the Earth with respect to the invariant plane of the solar system (i_3) is given by OQ (Laskar, 1992).

The inclinations and eccentricities of the orbits are therefore subject to only small variations about their mean values. But it must be stressed that Laplace’s solutions are very different from Kepler’s, because the orbits are no longer fixed. They are subject to a double precessional motion with periods ranging from 45,000 to several million years: precession of the perihelion, which is the slow rotation of the orbit in its plane, and precession of the nodes, which is the rotation of the plane of the orbit in space.

Later, Le Verrier, famed for his discovery in 1846 of the planet Neptune through calculations based on observations of irregularities in the movement of Uranus, took up Laplace and Lagrange’s calculations and considered the effects of higher order terms in the series (Le Verrier, 1840, 1841). He showed that these terms produced significant corrections and that Laplace’s and Lagrange’s calculations “could not be used for an indefinite length of time.” He then challenged future mathematicians to find exact solutions, without approximations. The difficulty posed by “small divisors” showed that the convergence of the series depended on initial conditions, and the proof of the stability of the solar system remained an open problem (see Laskar, 1992 for more details on this point).

Between 1892 and 1899 Poincaré formulated a negative response to Le Verrier’s question. In so doing he rethought the methods of celestial mechanics along the lines of Jacobi’s and Hamilton’s work. In his memoir On the three body problem and the equations of dynamics, Poincaré showed that it is not possible to integrate the equations of motion of three bodies subject to mutual interaction, and not possible to find an analytic solution representing the movement of the planets valid over an infinite time interval, since the series used by astronomers to calculate the movement of the planets were not convergent.

In the 1950s and 60s, the mathematicians Kolmogorov and Arnold took up Poincaré’s work and showed that, for certain values of the initial conditions, it was nonetheless possible to obtain convergent series. If the masses, eccentricities, and inclinations of the planets are small enough, then many initial conditions lead to quasiperiodic planetary trajectories, similar to the Laplace-Lagrange solutions. But the actual masses of the planets are much too large for this result (known as the KAM theorem) to apply directly to the solar system and thereby prove its stability. (In 1966, Michel Henon computed that the masses of the planets would need to be smaller than 10^(-320) of the solar mass in order to apply Arnold’s theorem on the stability of planetary systems; see Laskar, 2014.)

Although the constants required for the application of Arnold’s theorem correspond to extremely small values of the planetary masses, this result reinforced once more the idea that the Solar System is stable, in any reasonable sense of the term, over a time comparable to its age.

The results obtained through numerical integration in the past two decades show the contrary.
Chaos in the Solar System

In the past decades, the problem of Solar System stability has advanced considerably, due largely to computers which allow extensive analytic calculations and numerical integrations over model time scales approaching the age of the solar system.

One part of these efforts consists of direct numerical integration of the equations of motion (Newton’s equations, sometimes with additional relativistic corrections or perturbations due to the Moon). Initial studies were limited to the motion of the outer planets, from Jupiter to Pluto. In fact, the more rapid the orbital movement of a planet, the more difficult it is to numerically integrate its motion. To integrate the orbit of Jupiter with a conventional integrator, a step-size of 40 days will suffice, while a step-size of 0.5 days is required to integrate the motion of the whole solar system (Cohen et al., 1973; Kinoshita and Nakai, 1984; Carpino et al., 1987). These studies, reaching 100 million years, essentially confirmed the stability of the system and the validity of the old perturbative approach of Laplace and Lagrange. At about the same time, calculations of the same system were carried out at MIT over even longer periods, corresponding to times of 210 and 875 million years. These calculations were carried out on “Orrery,” a vectorized computer specially designed for the task (Applegate et al., 1986; Sussman and Wisdom, 1988). This latter integration showed that the motion of Pluto is chaotic, with exponential divergence with respect to the initial conditions and a characteristic (Lyapunov) time of 20 Ma. But since the mass of Pluto is very small (1/130,000,000 of the mass of the Sun), this does not induce macroscopic instabilities in the rest of the solar system, which appeared relatively stable in these numerical studies.

The other possibility, in order to overcome some of the limitations of numerical integrations, consists in a semi-analytical approach. Using perturbation methods developed by Lagrange, Laplace and Le Verrier, Laskar (1989) derived an extended averaged system for the whole solar system except Pluto, including all contributions up to second order with respect to the masses, and through degree 5 in eccentricity and inclination. For the outer planets, some estimated corrections of third order were also included. The system of equations thus obtained comprises some 150,000 terms, and does not model the motion of the planets themselves, but rather the averaged motion of their orbits. It can thus be integrated numerically with a very large step-size, on the order of 500 years. An integration over 200 million years showed that the solar system, and more particularly the system of inner planets (Mercury, Venus, Earth, and Mars), is chaotic, with a Lyapunov time of 5 million years (Laskar, 1989). An error of 15 m in the Earth’s initial position gives rise to an error of about 150 m after 10 Ma; but this same error grows to 150 million km after 100 Ma. It is thus possible to construct ephemerides over a 10 million year period, but it becomes essentially impossible to predict the motion of the planets with precision beyond 100 million years.
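These figures are consistent with simple exponential error growth, d(t) = d_0 e^{t/T_L}; checking the numbers quoted above (the arithmetic is mine, not the article’s):

\[
\frac{150\ \mathrm{m}}{15\ \mathrm{m}} = 10 = e^{(10\ \mathrm{Myr})/T_L}
\;\Rightarrow\;
T_L = \frac{10\ \mathrm{Myr}}{\ln 10} \approx 4.3\ \mathrm{Myr},
\]

and over the further 90 Myr, 150 m × e^{90/4.3} ≈ 150 m × 10^9 ≈ 1.5 × 10^{11} m = 150 million km, in line with the Lyapunov time of about 5 million years.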

This chaotic behavior essentially originates in the presence of two secular resonances among the planets: θ = 2(g_4 − g_3) − (s_4 − s_3), which is related to Mars and the Earth, and σ = (g_1 − g_5) − (s_1 − s_2), related to Mercury, Venus, and Jupiter (the g_i are the secular frequencies related to the perihelions of the planets, while the s_i are the secular frequencies of the nodes) (Laskar, 1990). The two corresponding arguments change several times from libration to circulation over 200 million years, which is also a characteristic of chaotic behavior.

The improvement of computer speed and the development of new methods for numerical integration made it possible to confirm most of these results by direct integration of Newton’s equations (Fig. 3) (Quinn et al., 1991; Laskar et al., 1992b; Sussman and Wisdom, 1992).

Figure 3: The eccentricity of the Earth (a) and Mars (b) during a 6 Ma timespan centered at the present. The solid line is the numerical solution from Quinn et al. (1991), and the dotted line the integration of the secular equations (Laskar, 1990). For clarity, the difference between the two solutions is also plotted (from Laskar et al., 1992).
Evolution of planetary orbits

Over less than one million years, although the motion of the Solar System is chaotic, a quasiperiodic model such as that of Laplace-Lagrange (fig. 2) gives a fair representation of the evolution of the planetary orbits. This linear model, although not very precise, provides in particular a good account of the variations of the eccentricity and inclination of the Earth, which are at the origin of the variations of the orientation of its rotation axis, and thus of the insolation at its surface. Indeed, a similar model derived by Le Verrier (1840, 1841) was used by M. Milankovitch to establish his astronomical theory of paleoclimates.

Over a longer period of a few million years, a quasiperiodic approximation of the solution is still possible, but it should take into account the effect of the resonances between the secular motions of the inner planets. It is thus not possible to obtain it with the classical perturbative method of Le Verrier and his successors. On the other hand, such an approximation can be obtained by refined Fourier techniques, after numerical integration of the averaged equations (Laskar, 1988, 1990).

The question of the maximum possible variations of the planetary orbits over the age of the Solar System now becomes even more difficult to answer: because of the exponential divergence of nearby orbits, we know that it will not be possible to obtain the precise orbital evolution of the Solar System beyond 100 Ma.

The computation of the evolution of the Solar System over 5 billion years may thus appear illusory, but the aim here is not to predict the precise evolution of the system, only to explore its possible behavior. With this intention, the integration of the orbits was even pushed over durations going well beyond the age of the Solar System (Laskar, 1994, 1995). The results (fig. 4) provide a very clear picture of the stability of the planetary orbits. The figure shows the computed evolution of the eccentricity of the orbits of the planets of the Solar System over a duration of 25 billion years (from −10 to +15 billion years).

In fact, for better clarity, the plotted curve represents only the maximum eccentricity reached by each planetary orbit over successive intervals of 10 million years. This procedure removes the oscillations of the eccentricity resulting from the linear coupling of the solutions (fig. 3); the only variations which remain in figure 4 are thus those due to the chaotic diffusion of the orbits.
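The reduction is straightforward: cut the eccentricity series into consecutive 10 Myr blocks and keep only the maximum of each block, which strips the fast quasiperiodic oscillations and leaves the slow chaotic drift. A minimal sketch, assuming a uniformly sampled series (array names hypothetical):

```python
import numpy as np

def windowed_max(ecc, dt_myr, window_myr=10.0):
    """Maximum of `ecc` over consecutive windows of `window_myr` Myr,
    assuming uniform sampling with step `dt_myr` (hypothetical input;
    the real series would come from the secular integration)."""
    n_per_win = int(round(window_myr / dt_myr))
    n_win = len(ecc) // n_per_win
    blocks = np.asarray(ecc)[: n_win * n_per_win].reshape(n_win, n_per_win)
    return blocks.max(axis=1)

# e.g. one sample per kyr gives 10,000 samples per 10 Myr window:
# envelope = windowed_max(ecc_series, dt_myr=0.001)
```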

For all the outer planets, the maximum eccentricity is almost constant. This reflects the fact that these trajectories are very close to regular, quasiperiodic trajectories; any instabilities are imperceptible at the scale of the drawing.

For Venus and the Earth, one observes moderate but still significant variations. The maximum eccentricity of the Earth reached through chaotic diffusion is about 0.08, whereas its current variations are approximately 0.06. It is about the same for Venus.

The two curves of the maximum eccentricity of the Earth and Venus are very similar, because of the linear coupling between these two planets. The evolutions of the orbits of Mars and Mercury are the most spectacular. The diffusion of the eccentricity of Mars can bring it to 0.2 in a few billion years, whereas the variations of Mercury’s orbit can drive its eccentricity to values exceeding 0.5.

Figure 4: Numerical integration of the averaged equations of motion of the Solar System 10 Ga backward and 15 Ga forward. For each planet, the maximum value of the eccentricity reached over intervals of 10 Ma is plotted versus time. For clarity, Mercury, Venus and the Earth are plotted separately from Mars, Jupiter, Saturn, Uranus and Neptune. The behavior of the large planets is so regular that all their curves of maximum eccentricity appear as straight lines. By contrast, the corresponding curves of the inner planets show very large and irregular variations, which attest to their diffusion in the chaotic zone (Laskar, 1994).

In fact, the system is still constrained by the conservation of angular momentum, which strongly constrains the most massive planets; it is remarkable that, in the inner system, the less massive a planet is, the larger the possible diffusion of its orbit. The behavior of the inclinations is very similar to that of the eccentricities.
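An order-of-magnitude check makes this plausible. The orbital angular momentum of a planet is L = m √(GM a(1 − e²)), with GM the gravitational parameter of the Sun, so even a drastic pumping of Mercury’s eccentricity barely touches the total angular momentum budget, which is dominated by the giant planets. A rough estimate with standard planetary parameters (illustrative only):

```python
import math

GM_SUN = 1.327e20          # gravitational parameter of the Sun, m^3 s^-2
AU = 1.496e11              # m

def ang_mom(mass_kg, a_au, e):
    """Orbital angular momentum L = m * sqrt(GM_sun * a * (1 - e**2))."""
    return mass_kg * math.sqrt(GM_SUN * a_au * AU * (1.0 - e * e))

L_mercury_now = ang_mom(3.30e23, 0.387, 0.206)
L_mercury_hot = ang_mom(3.30e23, 0.387, 0.7)    # wildly excited orbit
L_jupiter = ang_mom(1.898e27, 5.203, 0.048)

# Even this extreme excitation of Mercury changes the total angular
# momentum by only ~1e-5 of Jupiter's share.
print((L_mercury_now - L_mercury_hot) / L_jupiter)   # about 1.3e-5
```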

Because of the chaotic character of the orbits, a very small modification of the initial conditions leads, after a few hundred million years, to a solution different from the preceding one, although the general aspect of the solutions will undoubtedly remain the same. To evaluate the maximum possible variations of the planetary orbits over 5 billion years, one can then search, by making very small modifications of the initial conditions, for the trajectory which leads to the strongest variations of the orbits.

More systematically, Laskar (1994) computed 5 trajectories with very close initial conditions over 500 Ma. The trajectory leading to the largest eccentricity of Mercury was retained, and in the vicinity of this maximum, 5 new trajectories were computed for a further 500 Ma. For Mercury, this method made it possible to obtain, in a few tens of such stages, an orbit which crosses that of Venus in less than 3.5 billion years. Let us note that, traced back to the initial position, because of the exponential divergence of the orbits, the displacements of the initial conditions correspond to a displacement of the position of the Earth smaller than the Planck length (≈10⁻³³ cm).
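Schematically, this is a greedy restart search: integrate a handful of nearby clones, keep the one that maximizes Mercury’s eccentricity, re-seed new clones around that maximum, and iterate. A sketch of the logic only; `integrate`, standing for a 500 Myr propagation of the secular system, is a placeholder:

```python
import random

def perturb(state, eps):
    """Displace each coordinate by a relative amount of order `eps`."""
    return [x * (1.0 + eps * random.uniform(-1.0, 1.0)) for x in state]

def greedy_eccentricity_search(state0, integrate, n_clones=5, n_stages=20,
                               eps=1e-10):
    """Greedy search for a trajectory maximizing Mercury's eccentricity.

    `integrate(state)` is a placeholder: it should propagate the secular
    system for 500 Myr and return (state_at_maximum, max_eccentricity).
    """
    state, best_e = state0, 0.0
    for stage in range(n_stages):
        clones = [perturb(state, eps) for _ in range(n_clones)]
        state, best_e = max((integrate(c) for c in clones),
                            key=lambda r: r[1])
        print(f"stage {stage:2d}: max eccentricity so far {best_e:.3f}")
    return state, best_e
```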

It should however be noted that, to arrive at this possible collision between Mercury and Venus, the model was used beyond its rigorous domain of validity, which does not include the vicinity of collisions. In addition, the solution was carefully selected, so it is surely not a very probable one, and the majority of solutions with close initial conditions will not lead to this collision.

The same method was applied to all the other planets, but the chaotic diffusion of their orbits did not allow a collision in less than 5 billion years. The planet which, together with Mercury, has the most unstable orbit is Mars, whose eccentricity can, by the same method, reach approximately 0.25 in less than 5 billion years, while the Earth’s eccentricity barely reaches 0.1.
Marginal stability of the Solar System.
Figure 5: Estimates of the zones possibly occupied by the inner planets of the solar system over 5 Ga. The circular orbits correspond to the bold lines, and the zones visited by each planet resulting from the possible increase of eccentricity are the shaded zones. In the case of Mercury and Venus, these shaded zones overlap. Mars can go as far as 1.9 AU, which roughly corresponds to the inner limit of the asteroid belt (Laskar, 1995).

If one summarizes the results obtained from these integrations of the secular equations in a plane graph representing the zone swept by the planetary orbits when their eccentricities reach their maximum values (fig. 5), one notes that the inner Solar System is “full”: there is no room for an additional body. At least 3.5 billion years are needed to allow a collision between Mercury and Venus, but an additional body placed in this system would probably collide more rapidly with one of the already existing planets.

This observation leads to the concept of marginal stability for the Solar System: the Solar System is unstable, but catastrophic phenomena leading to its destruction in its current form can take place only on a timescale comparable to its age, that is to say approximately 5 billion years. The observation of its present state then makes it plausible that this has always been so for the Solar System, since the end of its formation. At that time, bodies other than the current planets may have remained, but in that case the system would have been much more unstable, and a collision or an ejection could have taken place (an example could be the impactor on the Earth which was at the origin of the formation of the Moon). After such an event, the remaining system becomes much more stable. We thus obtain a self-organization of the system towards increasingly stable states, which are always states of marginal stability.

This vision agrees with the models of formation of the planets by accretion of planetesimals (Safronov, 1969), because it shows how the residual bodies could disappear, in particular in the inner Solar System. It is remarkable that the zone swept by the orbit of Mars at its maximum eccentricity reaches the inner limit of the asteroid belt.

Concerning the system of the outer planets, things are appreciably different, because the direct short-period gravitational perturbations are more significant. Recent numerical simulations show that particles placed among the outer planets do not remain there beyond a few hundred million years, apart from some particular zones of stability or beyond Neptune, in the Kuiper belt, where objects have indeed been found.

Finally, these observations also give an idea of the general aspect of a planetary system around a star. Indeed, if the process of planetary formation from planetesimals is correct, it is plausible that planetary systems will always be in a state of marginal stability, like our own Solar System. At the end of the formation phase of the system, a large number of bodies may remain, but in that case the system is strongly unstable, which leads to a collision or an ejection. After this event, the system becomes more stable, with, at every epoch, a stability time comparable to its age.
Planetary collisions in the Solar System

The approach of Laskar (1994) allowed extremely fast computations but had some limitations, because the approximation obtained by the averaged equations loses accuracy as one approaches a collision. A study using the complete, non-averaged equations was therefore necessary to confirm these results. Moreover, because of the chaotic nature of the solutions, a single trajectory will not provide the evolution of the Solar System over more than a few tens of Myr, and a statistical analysis over an ensemble of solutions is required.
Figure 6: Evolution of the eccentricity of Mercury over 5 Gyr. (a) 201 solutions with close initial conditions, from a numerical integration that does not include general relativity; 121 of the 201 solutions lead to a very large increase of Mercury’s eccentricity. (b) 2501 solutions for the full system, including general relativity; only 21 solutions lead to a large increase of Mercury’s eccentricity. (Adapted from Laskar and Gastineau, 2009)

Indeed, before 2009, no direct integration of even a single trajectory of the Solar System had been published using a realistic model that includes the effect of the Moon and general relativity. To approach this problem, Laskar (2008) carried out a statistical study using the averaged equations (whose numerical integration is about 1000 times faster than for the full equations), for 1000 different solutions integrated over 5 Gyr. This study showed that the probability for Mercury to reach very high eccentricities (> 0.6) is on the order of 1%. When the contribution of general relativity is neglected, the same system of averaged equations gives very surprising results: in this case, more than half of the trajectories lead to an increase of the eccentricity beyond 0.9 in less than 5 Gyr. These results were confirmed by a direct (non-averaged) numerical integration, using a symplectic integrator (Laskar et al., 2004), of a purely Newtonian planetary model for 10 trajectories with close initial conditions; consistently with the secular system, 4 trajectories out of 10 led to eccentricity values for Mercury larger than 0.9 (Laskar, 2008). This large excursion of the eccentricity of Mercury is explained by the presence of a resonance between the perihelia of Mercury and Jupiter, which is reached more easily in the absence of general relativity (GR). GR is known to increase the precession rate of the perihelion of Mercury by 0.43″/yr, moving it from 5.15″/yr to 5.58″/yr and thus sending it further from the perihelion precession rate of Jupiter (4.25″/yr). Independently, Batygin and Laughlin (2008) published similar results: they repeated the calculation of Laskar (1994) with a system of non-relativistic equations, and also demonstrated the possibility of collisions between Mercury and Venus.
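The 0.43″/yr figure is the classical general-relativistic perihelion advance, Δϖ = 6πGM/(c²a(1 − e²)) per orbit, with GM the gravitational parameter of the Sun. A quick check with standard constants reproduces it:

```python
import math

GM_SUN = 1.32712e20        # gravitational parameter of the Sun, m^3 s^-2
C = 2.99792458e8           # speed of light, m/s
A = 5.7909e10              # Mercury's semi-major axis, m
ECC = 0.20563              # Mercury's eccentricity
PERIOD_YR = 0.24085        # Mercury's orbital period, yr

# GR perihelion advance per orbit, in radians.
dpomega = 6.0 * math.pi * GM_SUN / (C**2 * A * (1.0 - ECC**2))

arcsec_per_yr = math.degrees(dpomega) * 3600.0 / PERIOD_YR
print(f"{arcsec_per_yr:.3f} arcsec/yr")   # about 0.430, i.e. 43 arcsec/century
```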

These results were still incomplete. Indeed, as the relativistic system is much more stable than the non-relativistic one, it is much more difficult to exhibit an orbit leading to a collision between Mercury and Venus in the realistic (relativistic) system than in the non-relativistic systems considered in these two previous numerical studies. The real challenge was therefore to estimate the probability of a collision between Mercury and Venus in a realistic, relativistic, non-averaged model.
Collisions of Mercury, Mars and Venus with the Earth

In order to confirm the results obtained in 1994 and 2008 with the averaged equations, Laskar and Gastineau (2009) then undertook a massive computation of orbital solutions for the Solar System, using a non-averaged model consistent with the short-term, highly accurate INPOP planetary ephemeris developed in the preceding years (Fienga et al., 2008). Thanks to the installation of the JADE supercomputer at CINES, near Montpellier, with more than 12,000 cores, Laskar and Gastineau could benefit from a large amount of computer time during the testing period of this machine. They started the computations as soon as the machine was switched on, in early August 2008, using 2501 cores, with one trajectory computed on each core. The computations were finalised in about 6 months, totalling about 7 million hours of single-node CPU time.
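The setup is embarrassingly parallel: each trajectory is independent of the others, so one core can own one solution for the entire run. A schematic sketch of this pattern (the `propagate` function is a placeholder, not the actual INPOP-based integrator):

```python
from multiprocessing import Pool

def propagate(init_cond):
    """Placeholder for the real integrator: propagate one Solar System
    trajectory over 5 Gyr and return a summary of its behaviour
    (e.g. the maximum eccentricity reached by Mercury)."""
    ...

def run_ensemble(initial_conditions, n_workers):
    # One worker per trajectory; no communication is needed between
    # trajectories, which is why the ensemble scales to thousands of cores.
    with Pool(n_workers) as pool:
        return pool.map(propagate, initial_conditions)

# Usage (schematic):
# if __name__ == "__main__":
#     summaries = run_ensemble(ensemble_of_2501_initial_conditions, 2501)
```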
Figure 7: Example of long-term evolution of the planetary orbits: Mercury (white), Venus (green), Earth (blue), Mars (red). Time is indicated in thousands of years (kyr). (a) In the vicinity of the current state, the orbits become distorted under the influence of planetary perturbations, but without allowing close encounters or collisions. (b) In about 1% of cases, the orbit of Mercury may be distorted enough to allow a collision with Venus or the Sun in less than 5 Gyr. (c) In one of the trajectories, the eccentricity of Mars increases sufficiently to allow for a close encounter or collision with Earth. (d) This leads to a destabilisation of the terrestrial planets that also allows a collision between Venus and Earth. (Adapted from Laskar and Gastineau, 2009)

With the JADE machine, they were able to simulate 2501 different solutions of the motion of the planets of the whole Solar System over 5 billion years, corresponding to the life expectancy of the system before the Sun becomes a red giant. The 2501 computed solutions are all compatible with our current knowledge of the Solar System, and should thus be considered as equiprobable outcomes of its future. In most of the solutions, the trajectories continue to evolve as they do over the current few million years: the planetary orbits are deformed and precess under the mutual perturbations of the planets, but without any possibility of collisions or of ejections of planets out of the Solar System. Nevertheless, as predicted by the secular equations, in about 1% of the cases the eccentricity of Mercury increases considerably. In many of these cases, the deformation of the orbit of Mercury leads to a collision with Venus, or with the Sun, in less than 5 Ga, while the orbit of the Earth remains little affected. However, in one of these orbits, the increase of the eccentricity of Mercury is followed by an increase of the eccentricity of Mars, and a complete destabilisation of the inner Solar System (Mercury, Venus, Earth, Mars) after about 3.4 Gyr. Out of 201 additional cases studied in the vicinity of this destabilisation, 5 ended in an ejection of Mars out of the Solar System. Others led to collisions between the planets, or between a planet and the Sun, in less than 100 million years. One case resulted in a collision between Mercury and the Earth, 29 cases in a collision between Mars and the Earth, and 18 in a collision between Venus and the Earth (Laskar and Gastineau, 2009).
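A rough statistical uncertainty can be attached to the quoted 1% from the counts of figure 6: 21 high-eccentricity outcomes among 2501 equiprobable solutions. A simple normal-approximation interval (illustrative only):

```python
import math

n, k = 2501, 21            # solutions computed / high-eccentricity outcomes
p = k / n
half_width = 1.96 * math.sqrt(p * (1.0 - p) / n)   # 95% normal approximation
print(f"p = {p:.4f} +/- {half_width:.4f}")          # p = 0.0084 +/- 0.0036
```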

Beyond their spectacular aspect, these results also validated the semi-analytical averaging methods developed over more than 20 years, which had made it possible to show the possibility of a collision between Mercury and Venus (Laskar, 1994). These results also answer the question raised more than 300 years ago by Newton, by showing that collisions among planets, or ejections, are actually possible within the life expectancy of the Sun, that is, in less than 5 Gyr. The main surprise from the numerical simulations of recent years is that the probability for these catastrophic events to occur is relatively high, on the order of 1%, and thus not just a mathematical curiosity with an extremely low probability. At the same time, 99% of the trajectories behave much as in the recent past millions of years, which is consistent with our common understanding that the Solar System has not evolved much in the past 4 Gyr. More surprising is that, in a purely Newtonian world starting from the present initial conditions, the probability of collisions within 5 Gyr grows to 60%, which can thus be considered as an additional indirect confirmation of general relativity.
Figure 8: Artist’s view of a Venus-Earth collision (J. Vidal-Madjar, © IMCCE-CNRS).
References

Applegate, J.H., Douglas, M.R., Gursel, Y., Sussman, G.J. and Wisdom, J.: 1986, The Solar System for 200 million years. Astron. J., 92, 176–194

Arnold, V.: 1963a, Proof of Kolmogorov’s theorem on the preservation of quasi-periodic motions under small perturbations of the Hamiltonian. Rus. Math. Surv., 18, N6, 9–36

Arnold, V.I.: 1963b, Small denominators and problems of stability of motion in classical celestial mechanics. Russian Math. Surveys, 18, 6, 85–193

Batygin, K., Laughlin, G.: 2008, On the Dynamical Stability of the Solar System. ApJ, 683, 1207–1216

Carpino, M., Milani, A. and Nobili, A.M.: 1987, Long-term numerical integrations and synthetic theories for the motion of the outer planets. Astron. Astrophys., 181, 182–194

Cohen, C.J., Hubbard, E.C., Oesterwinter, C.: 1973, Astron. Papers Am. Ephemeris, XXII, 1

Fienga, A., Manche, H., Laskar, J., Gastineau, M.: 2008, INPOP06. A new numerical planetary ephemeris, A&A, 477, 315-327

Kinoshita, H., Nakai, H.: 1984, Motions of the perihelion of Neptune and Pluto. Cel. Mech., 34, 203

Kolmogorov, A.N.: 1954, On the conservation of conditionally periodic motions under small perturbation of the Hamiltonian. Dokl. Akad. Nauk. SSSR, 98, 469

Laskar, J.: 1988, Secular evolution of the Solar System over 10 million years, Astron. Astrophys., 198, 341-362

Laskar, J.: 1989, A numerical experiment on the chaotic behavior of the Solar System, Nature, 338, 237-238

Laskar, J.: 1990, The chaotic motion of the solar system. A numerical estimate of the size of the chaotic zones, Icarus, 88, 266-291

Laskar, J.: 1992a, La stabilité du Système Solaire, in Chaos et Déterminisme, A. Dahan et al., eds., Seuil, Paris; partially translated and reprinted as: Laskar, J.: 1995, The Stability of the Solar System from Laplace to the Present, in General History of Astronomy, R. Taton and C. Wilson, eds., vol. 2B, pp. 240-248

Laskar, J., Quinn, T., Tremaine, S.: 1992b, Confirmation of Resonant Structure in the Solar System. Icarus, 95, 148–152

Laskar, J.: 1994, Large scale chaos in the Solar System. Astron. Astrophys., 287, L9–L12

Laskar, J.: 1995, Large scale chaos and marginal stability of the Solar System, in Proceedings of the XIth ICMP Meeting (Paris, July 1994), International Press, pp. 75-120; also in Celestial Mechanics, 64, 115-162

Laskar, J., Robutel, P., Joutel, F., Gastineau, M., Correia, A.C.M., Levrard, B.: 2004, A long-term numerical solution for the insolation quantities of the Earth, Astron. Astrophys., 428, pp. 261-285

Laskar, J.: 2008, Chaotic diffusion in the Solar System, Icarus, 196, 1-15

Laskar, J., Gastineau, M.: 2009, Existence of collisional trajectories of Mercury, Mars and Venus with the Earth, Nature, 459, 817-819

Laskar, J.: 2013, Is the Solar System stable?, Progress in Mathematical Physics, 66, pp. 239-270

Laskar, J. 2014, Michel Henon and the Stability of the Solar System, http://arxiv.org/abs/1411.4930

Le Verrier, U.J.J.: 1840, Mémoire sur les variations séculaires des éléments des orbites pour les sept planètes principales, Mercure, Vénus, la Terre, Mars, Jupiter, Saturne et Uranus. Presented at the Academy of Sciences on September 16, 1839, Additions à la Connaissance des temps pour l’an 1843, Paris, Bachelier, pp. 3–66

Le Verrier, U.J.J.: 1841, Mémoire sur les inégalités séculaires des planètes. Presented at the Academy of Sciences on December 14, 1840, Additions à la Connaissance des temps pour l’an 1844, Paris, Bachelier, pp. 28–110

Quinn, T.R., Tremaine, S., Duncan, M.: 1991, ‘A three million year integration of the Earth’s orbit,’ Astron. J. 101, 2287-2305

Safronov, V.S.: 1969, Evolution of the protoplanetary cloud and formation of the Earth and planets; English translation (1972), Keter Publishing House, Jerusalem, 212 pp.

Sussman, G.J., and Wisdom, J.: 1988, ‘Numerical evidence that the motion of Pluto is chaotic.’ Science 241, 433-437

Sussman, G.J., and Wisdom, J.: 1992, ‘Chaotic evolution of the solar system’, Science 257, 56-62
Sponsored by: Prof. Alessandra Celletti, Dipartimento di Matematica, Universita’ di Roma (Tor Vergata), Italy

London Review of Books

Christian Lorentzen lives in Brooklyn.

Vol. 41 No. 18 · 26 September 2019
pages 23-26 | 5192 words

Dial Up, Log On
Christian Lorentzen

Permanent Record by Edward Snowden
Macmillan, 339 pp, £20.00, September, ISBN 978 1 5290 3565 0

Edward Snowden was born in the summer of 1983. Around this time, the US Defence Department split its computer network into MILNET, an internal military branch, and a public branch, which we now know as the internet. Home computers were becoming pervasive; the Commodore 64 was selling in the millions. One day Snowden’s father brought one home, connected it to the TV set, and the toddler Eddie noticed that his father was now controlling what was happening on the screen. The boy sat on his father’s lap and watched him pilot a helicopter in the living room: he was playing the flight simulator game Choplifter! Six-year-old Eddie received a Nintendo Entertainment System for Christmas in 1989, and his ‘real education’ began. He learned about the ‘invisible wall’, the rule that prevents a player of Super Mario Bros from going backwards in a game that moves, like words on a page, strictly from left to right. There is no reversing time. The leaker of government documents can’t go home again.

Everyone grows up with computers now, and many people were born in 1983 – e.g. Amy Winehouse and Kim Jong-un – but, however atypical Snowden was in other ways, this timing was crucial. A little older and he wouldn’t have encountered the internet until adulthood; a little younger and the internet he found would have been the corporate version we have now. His family got its first modem in 1992, when connections were slow, internet users pseudonymous, and the rooms where they chatted self-regulating and unmonitored. It was a zone of freedom and forgiveness where identities could be picked up and discarded without consequence. Say something silly or stupid and you could simply change your handle and join the other chatroom denizens in mocking your former self. Snowden calls the efforts of governments and businesses to link online personas to legal identities – efforts supported by the rise of Facebook, and its insistence that users go by their real names – ‘the greatest iniquity in digital history’.

1983 also saw the release of WarGames, the film that turned the figure of the teenage hacker, played by Matthew Broderick, into a hero. Broderick’s character starts by using his modem to tweak his high school grades; soon he is infiltrating government computer systems and averting nuclear disaster. The film further disseminated the already widespread idea that computer networks had grown too complex and powerful for the government to control; Reagan watched it, and it’s said to have led to the first presidential directive on cybersecurity. That directive – which charged the National Security Agency with the task of monitoring and protecting information transmitted by US military, business and personal computers – was soon overridden by Democrats in Congress: the NSA had been founded, in 1952, to intercept foreign communications and was officially prohibited from spying on Americans. This muddling of its mission would become part of the story of Snowden’s life.

Snowden’s significant hack, as a teenager, was of the Los Alamos National Laboratory. Alarmed by an article he had read about the history of the US nuclear weapons programme, he went to the lab’s website – and was excited to see that it had an open directory structure. He didn’t find instructions on how to build a nuclear bomb, which were anyhow available elsewhere on the web, but he did find internal memos and information on personnel not meant for public consumption. He called the lab and left a voicemail message informing the authorities of the lab’s vulnerability. Weeks went by and he kept checking the site to see if the hole had been plugged. Then one afternoon the phone rang; his mother answered and the colour left her face. ‘What did you do?’ she asked her son, and handed him the receiver. An IT worker from Los Alamos was on the line. He thanked ‘Mr Snowden’ for his efforts, asked if he was interested in a job, and told him to get in touch when he turned 18. His mother didn’t punish him.

The family was wrapped in the Stars and Stripes. Snowden’s father, Lonnie, was an electronics instructor for the Coast Guard, and the boy’s first nine years were spent in the port town of Elizabeth City, North Carolina. His mother, Wendy, was descended from two passengers on the Mayflower: Priscilla Mullins, ‘the only single woman of marriageable age in the whole first generation of the Plymouth Colony’, and the ship’s cooper, John Alden, whom she chose over the colony’s military supremo, Commander Myles Standish – an episode that became the subject of a poem by Longfellow. The maternal line included heroes of the War of Independence and cousins who fought on opposing sides in the Civil War, all the way down to Wendy’s father, a Coast Guard admiral. The Snowdens were 17th-century Quaker settlers in Maryland and the name was still remembered on street signs when the family moved there in 1992: Lon had been transferred to a new post, and Wendy took a job at the NSA, where she administered pension benefits. Both parents had top security clearances. Most of their neighbours worked for the government or the military and at barbecues nobody talked about their job. Normality was cover.

Permanent Record seems to have been written reluctantly: a memoir by a celebrity dissident dedicated to the cause of digital privacy. There were surely market incentives for the book to take the form it has: publishers prefer personal revelations to manifestos. But for all the storytelling it is a manifesto all the same. The innocent boy grows up in a digital paradise that becomes a fallen world when government and capital learn how to control it. Snowden’s reticence about himself dates to high school. Towards the end of his freshman year his English class was given the assignment: ‘Please produce an autobiographical statement of no fewer than a thousand words.’

I was being ordered by strangers to divulge my thoughts on perhaps the only subject on which I didn’t have any thoughts: the subject of me, whoever he was. I just couldn’t do it. I was blocked. I didn’t turn anything in and received an Incomplete.

The boy’s parents were getting divorced. Home was now a place of secrets, lies and strife. Soon enough the house would be sold, and the boy would move into a condominium with his mother and sister. He was confused and sullen and blamed himself. He stopped answering to ‘Eddie’ and became ‘Ed’. He stopped describing what he was doing at the computer as ‘playing’ and started to call it ‘working’. This was where he believed his real life was taking place, a secret life lived on bulletin boards, where he chatted and messaged anonymously with adult strangers who advised him on such questions as how to build his own computer or how to make this chipset work with that motherboard. Although he doesn’t go into it, all this must have marked him as a ‘geek’ or a ‘nerd’ among his peers at a time when those words still carried a stigma among children. The mixture of irony and smugness that ‘geek’ and ‘nerd’ now convey is a recent phenomenon, the consequence of their having been reappropriated by tech workers, fans of comic book movies and other yuppies who are emboldened by the millions or billions of dollars that the most successful nerds bring in. While his classmates were agonising over who was going to make the team or who had the right sneakers, Snowden was daydreaming about getting back to his Compaq, dialling up and logging on. Online was a higher plane, even if much of it was gaming. When he could get away with it, he stayed up all night in front of the screen.

He had little taste for school, ‘an illegitimate system’ that ‘wouldn’t recognise any legitimate dissent’. He remembers his teachers as arbitrary tyrants who taught classes about democracy but couldn’t be voted out of their jobs by their students. At a certain point he realised he could still get Bs or Cs without turning in any of his homework (worth only 5 per cent of the final grade) by acing his quizzes and earning maximum points for class participation. He wanted to spend all his available time at the computer. When a teacher asked him why he wasn’t doing any homework, he was honest and explained his system (or ‘hack’, as he calls it). The school’s administrators reacted by changing the rules so that failure to turn in a certain number of assignments automatically resulted in a failing grade. But Snowden wasn’t long for high school in any case. As a sophomore he contracted mononucleosis, missed four months of classes and was told he would have to repeat the year. He dropped out instead and enrolled at a community college, where he pursued a General Educational Development degree.

Here the book moves away from the sensuous evocation of a boy’s online dreamworld to describe the grittier life of the IT professional. The outlines of the story are familiar enough, since Snowden has been a subject of journalistic scrutiny for years and a hero of movies, including Oliver Stone’s somewhat misleading biopic. Snowden’s first job was as a web designer for a woman he had met in a Japanese class, a fellow anime enthusiast. They fell out after 9/11: he was all for the war on terror; she, a dove, moved to California. Snowden’s politics at the time go mostly unexamined: he calls them a ‘mash-up of the values I was raised with and the ideals I encountered online’. He was now 18, and it was common enough then for Americans to back Bush in his every undertaking. In this Snowden was little different from two of the last three Democratic presidential nominees or the editors of the New Yorker.

Snowden enlisted in the army. For his family this was a major rebellion: they were Coast Guard people and thought of the army leadership as ‘the crazy uncles of the US military’. His mother cried and his father told him he was wasting his technical talents. But he wanted ‘to be praised for and to succeed at something else – something that was harder for me. I wanted to show that I wasn’t just a brain in a jar; I was also heart and muscle.’ Before basic training he qualified to be a Special Forces sergeant through a programme called 18 X-Ray, which was designed to ‘augment the ranks of the small flexible units that were doing the hardest fighting in America’s increasingly shadowy and disparate wars’. One senses that the gamer in Snowden got the better of him here. He had hardly been a sporty child, and although he was fit, his build was slight. This was an advantage when it came to doing push-ups (not much to lift), but the regimen was withering. ‘The army makes its fighters by first training the fight out of them until they’re too weak to care, or to do anything besides obey.’ A few weeks into basic training, on a manoeuvre in the woods in boots that were too big for him, he slipped while trying to avoid a snake and broke his leg – an injury that threatened to derail him from the Special Forces track. On doctors’ advice he accepted an ‘administrative separation’ and left the army without dishonour; for its part, the army was free of any liability for a disability claim. They let him keep his crutches.

‘I was ready to face the facts,’ Snowden writes. ‘If I still had the urge to serve my country, and I most certainly did, then I’d have to serve it through my head and my hands – through computing.’ While laid up at his mother’s place he applied for Top Secret/Sensitive Compartmented Information clearance, the type ‘required by positions with the top-tier agencies – CIA and NSA’. He worried about his online footprint:

the endless conveyor belt of stupid jingoistic things I’d said, and the even stupider misanthropic opinions I’d abandoned, in the process of growing up online. Specifically, I was worried about my chat logs and forum posts, all the supremely moronic commentary that I’d sprayed across a score of gaming and hacker sites. Writing pseudonymously had meant writing freely, but often thoughtlessly. And since a major aspect of early internet culture was competing with others to say the most inflammatory thing, I’d never hesitate to advocate, say, bombing a country that taxed video games, or corralling people who didn’t like anime into re-education camps. Nobody on those sites took any of it seriously, least of all myself.

He dreaded having to explain himself ‘to a grey-haired man in horn-rimmed glasses peering over a giant folder labelled PERMANENT RECORD’. He was repelled by the ‘overheated, hormonal opinions’ of his younger self. He considered coding a script that would programmatically wipe everything he’d written from every site but decided against it: he ‘didn’t want to live in a world where everyone had to pretend that they were perfect’. There’s a little naivety here. ‘We can’t erase the things that shame us, or the ways we’ve shamed ourselves, online. All we can do is control our reactions – whether we let the past oppress us, or accept its lessons, grow, and move on.’ I agree with the sentiment, but the question is how to define ‘we’. Employees of the New York Times may be ready to own up to shameful jokes they made as undergraduates, but that does not stop right-wing operatives combing through their social media trails in an attempt to get them fired. A general amnesty for having once been an asshole doesn’t seem to be on the cards. As for Snowden, he passed his polygraph and got his clearance. The intemperate things he’d said online and his teenage hack of the Los Alamos lab were never discussed. Around this time he also met his girlfriend, Lindsay, through the website HotOrNot.com: he rated her a ten; she gave him an eight. Now they’re married, and unashamed of their dating profiles.

*

At this point the thriller plot begins its gradual build-up, with Snowden’s journey eventually taking him around the world: Geneva, Tokyo, Hawaii, Hong Kong, Moscow. Crucially, at an early stage, Snowden became a systems administrator, rather than a software or network specialist. This would lead him to an ‘intense engagement with the deepest levels of integration of computing technology’: the job was to look at the whole picture while others concerned themselves with parts. Then there were the technicalities of his employment status. Snowden explains at some length the combination of legislative and budgetary incentives that have caused the US intelligence community to rely on private contractors in ever larger numbers at the expense of career civil servants. The law puts a cap on the number of permanent employees the agencies can hire, but by using temporary contractors they can keep their manpower elastic as their budgets expand. There are no pensions for such workers but the pay tends to be much better and – as with bureaucrats and lawmakers who line their pockets while out of office – a revolving door allows contractors to make good money in the private sector once they have a security clearance: a classic neoliberal mix of precarity and profiteering. In order to get that clearance you need first to work for the government, but most of its hires quickly leave for a private employer that will contract them back to federal agencies. There’s some bitterness in Snowden’s explanation; he wasn’t able to serve his country the way his father and grandfather had: ‘The federal government was less the ultimate authority than the ultimate client.’ Then there’s the fact that after his leaks he was smeared in the press for his status as a mere contractor, as if he were a fly-by-night temp rather than a career cyberspy.

The career began on the night shift as a systems administrator for the NSA’s new Centre for Advanced Study of Language. A joint project with the University of Maryland, the centre wasn’t yet operational so there wasn’t much to do. Bored, Snowden went to a job fair and, as a nominal employee of the technology company Cosmo, he became a subcontractor for the CIA at its headquarters in McLean, Virginia (Langley is a historical holdover used by Hollywood). During the job interview his corporate manager, whom he would never see again, talked up his salary requirements because it would raise the firm’s fees. His fellow trainees were all ‘computer dudes’ with tattoos, dyed hair and piercings. The scene of a gaggle of wan hipster freaks being handed responsibility for the American spy cult’s technostructure lends an air of inevitability to Snowden’s future revelations.

After nine months in McLean, Snowden was itinerant. He had a stint at a training facility in rural Virginia for ‘commo guys’ being sent abroad, then applied for a post in Iraq or Afghanistan. He was assigned instead to Geneva, possibly because he had protested against the working conditions he and his fellow trainees endured (no overtime, no leave, being put up in a crumbling Comfort Inn). In Switzerland he had his first and only experience of human intelligence work, after tipping off his commanding officer about the potential usefulness of a Saudi banker he had met at a dinner party. The boss responded by taking the banker out, getting him loaded and then tipping off the police that he was driving home drunk. When the man’s licence was suspended and he was fined, the agent offered him a loan and a daily lift to his office so that his superiors wouldn’t find out about his arrest. But the banker refused to be turned and moved back to Saudi Arabia. ‘It was a waste,’ Snowden writes, ‘which I myself had put in motion and was powerless to stop.’

In 2009 he moved to Tokyo to work at the NSA’s Pacific Technical Centre, a facility tasked with intercepting communications in the region and sharing some of the resulting intelligence with governments friendly to the US. He was responsible for designing a backup system that would allow the NSA to transmit its data efficiently and store it for as long as possible. Ideally it would be stored for ever: ‘The NSA’s conventional wisdom was that there was no point in collecting anything unless they could store it until it was useful, and there was no way to predict when exactly that would be.’ While there, he was also assigned to deliver a talk at a conference on countersurveillance measures being taken against China, a state whose cyber capabilities were ‘mind-boggling’. He believed there was a crucial distinction between China and America, which condemned the one and redeemed the other: in principle, the US never used its powers of surveillance against its own citizens, except at times of emergency. But then came news that Obama, for whom his girlfriend had campaigned, had refused to curtail the Bush-era programme of warrantless wiretapping that targeted US citizens in their communications with foreigners. Snowden found this troubling.

When the classified version of the 2009 Report on the President’s Surveillance Programme came to his attention – it was his job as a sysadmin to wipe it from a low-security folder where it didn’t belong – he learned that what had been hidden from the public (in the unclassified version of the report) was a project codenamed STELLARWIND, which involved the bulk collection of internet communications:

The US government was developing the capacity of an eternal law enforcement agency. At any time, the government could dig through the past communications of anyone it wanted to victimise in search of a crime (and everybody’s communications contain evidence of something). At any point, for all perpetuity, any new administration – any future rogue head of the NSA – could just show up to work and, as easily as flicking a switch, instantly track anybody with a phone or a computer, know who they were, where they were, what they were doing with whom, and what they had ever done in the past.

Snowden makes the case that metadata – the information that our devices generate about what we do with them and what they do without our awareness – is ‘not some benign abstraction, but the very essence of content: it is precisely the first line of information that the party surveilling you requires.’ After learning of STELLARWIND, Snowden fell into a depression: ‘I felt far from home, but monitored. I felt more adult than ever, but also cursed with the knowledge that all of us had been reduced to something like children, who’d been forced to live the rest of our lives under omniscient parental supervision.’

During a brief return to Washington, after four years abroad, Snowden found that he and Lindsay were suddenly prosperous: they moved into a three-storey townhouse and he wore nice suits. For Valentine’s Day he bought her the revolver she’d always wanted. (They were never exactly a pair of pious liberals.) But he noticed that appliance stores were now selling internet-connected ‘smart’ fridges. He inferred that the only true purpose of such developments could be to enable the corporate collection of household data, yet everybody was signing up for it, avidly. The liberating internet of his youth was really gone, and now even the simplest visit online was a fraught activity. In his neighbourhood he noticed security cameras everywhere and licence-plate readers at every traffic light. He started imagining ‘a world in which all laws were totally enforced, automatically, by computers’, and ‘no policing algorithm would ever be programmed, even if it could be, towards leniency or forgiveness.’ He advised Lindsay to delete her Facebook account. ‘If I did that,’ she said, ‘I’d be giving up my art and abandoning my friends.’ He tried to tell her without telling her what he’d learned, and suggested she imagine a ‘Spreadsheet of Total Destruction’ with ‘every speck of information that could destroy your life’. She wouldn’t tell him what the most incriminating item on her list would be. He started having dizzy spells, falling off ladders, dropping spoons. It wasn’t just the paranoia. He was diagnosed with epilepsy.

The couple moved to Hawaii in 2012 mostly because of concern about his health: the climate and lifestyle would be good for him, and he could cycle to work at an NSA facility called the Tunnel, a converted aircraft factory that now served as a hub for the monitoring of communications between the continental US and Asia. He worked for the Office of Information Sharing, technically its only employee, administering the system that determined who could see which documents: ‘The NSA had made me the manager of document management … my job was to know what sharable information was out there.’ He says his decision to investigate NSA abuses came around the time of his 29th birthday, and rather than just copying documents he started reading them. He set up an automated ‘readboard’, a sort of internal news blog, that gathered agency documents and made them available to anyone with the proper security clearances and copied them to a server that Snowden managed. He soon learned of the existence and mechanics of the PRISM programme, which allowed the NSA to collect data from Microsoft, Google, Facebook et al. via a FISA court order, and ‘upstream collection’, which

enabled the routine capturing of data directly from private-sector internet infrastructure – the switches and routers that shunt internet traffic worldwide via the satellites in orbit and the high-capacity fibre-optic cables that run under the ocean. This collection was managed by the NSA’s Special Sources Operation unit, which built secret wire-tapping equipment and embedded it inside the corporate facilities of obliging internet service providers around the world.

With upstream collection, the agency’s algorithms trawl through the vast body of collected data to hunt out evidence of particular activities deemed to be of interest to the NSA. This might be as simple as someone, somewhere, searching for a certain keyword – ‘protest’, say, or ‘bomb’. As soon as such activity is detected, users can be targeted with malware injected into their internet traffic that will allow the collection of the entirety of a device’s data: everything downloaded onto a phone, every call you make, every site you visit. Agency malfeasance had become inhuman, automatic, structural.

In the last stage of his investigation, after copying his files to data cards he smuggled into the office in a Rubik’s cube, Snowden took a new job in order to gain access to a program called XKEYSCORE, ‘a search engine that lets an analyst search through all the records of your life’. During a spell of training at NSA headquarters in Maryland, he witnessed analysts showing one another nudes of the subjects they monitored and engaging in LOVEINT, personal surveillance of their past or current lovers, an illegal activity but one for which no one has ever been prosecuted, because the analysts know how not to get caught – and because prosecuting them would reveal the extent of the agency’s surveillance capabilities and their potential for abuse. He looked through ‘the shared targeting folders of a “persona” analyst’ and found an Indonesian engineer who’d been watched because he’d applied for an academic job at a university in Iran. The man’s home videos with his young son reminded him of himself and his own father, and he suddenly realised that the documents he was about to leak would separate him from his family for ever.

*

Snowden had reason to believe that those leaks might have little impact. In the summer of 2004, James Risen and Eric Lichtblau of the New York Times had been ready to publish a report on the NSA’s Bush-era surveillance programme, including an aspect of STELLARWIND, but the administration put in calls to the paper’s editor, Bill Keller, and to its publisher, Arthur Sulzberger, suggesting that they halt publication in the interest of national security. By the time the report finally appeared, in December 2005, Bush had already been safely re-elected. More recently, the press had largely ignored stories hiding in plain sight: the construction by the NSA of a new massive data repository in Utah, which was covered only by the journalist James Bamford; and a talk at a public convention in New York by a top CIA technician who told journalists that ‘it is nearly within our grasp to compute on all human generated information’ – a statement that generated a single item in the Huffington Post. It was the drama of Snowden’s flight to Hong Kong, his personal explanations of the documents he had delivered to Glenn Greenwald, Laura Poitras and Ewen MacAskill, and the young stubbly face he put to the story in videos filmed in his hotel room, that caused his revelations to have the effect they did. What the world learned from Snowden from June 2013 onwards helped lead to the passing of US laws against the bulk collection of communications data, to encryption by default on iPhones and Android devices, and to the widespread encryption of web traffic (signified by the ‘https://’ that now prefaces almost all URLs). But laws only apply in the countries where they’re passed, and technologies, including modes of encryption, quickly become obsolete. Ignorance, apathy and laziness will leave most individuals open to whatever new forms of mass surveillance governments and corporations can invent. Those whose business is surveillance will always do what it takes to stay one step ahead.

In revealing everything he did, including his own identity, to the probable detriment of his health, wealth and sanity, Snowden was also violating his own privacy. Permanent Record takes that self-violation as far as it can go. About the other characters in his life – his parents, his colleagues, a fellow army trainee who tells him he’s about to go AWOL and then makes a run for it while in the latrine, an old-timer at CIA headquarters whose main job seems to have been to change a tape in an outdated recording system every night – there is just enough to provide the narrative with colour. Lindsay is an exception. One chapter includes her diary entries from the days when Snowden was a missing person, as the FBI harassed her and the press smeared her as a stripper on the basis of social media posts from her pole-dance fitness class. But really this is a book with a single hero. From the scene of the boy watching his father repair his Nintendo box at a Coast Guard base, to the scene of the man on the run teaching journalists how not to be spied on (seal your smartphone in a plastic bag and put it in the hotel room minifridge): out of all this a coherent character emerges, someone whose wonder at machines has, through his own mastery of them, turned into a kind of horror. Most of the rest of us still clutch them with giddiness, incomprehension at their workings, and an insatiable need for the next shiny thing that pops up on the screen.

At a lounge in Sheremetyevo airport in Moscow, where Snowden had fled as he tried to make his way to Ecuador, an FSB agent gave him a cold pitch to work for Russian intelligence. Snowden immediately cut him off; anyway, he’d already given all the documents away to journalists. The agent then informed him that his passport had become invalid during his flight from Hong Kong – a true airport thriller scene. (In his acknowledgments, Snowden thanks the novelist Joshua Cohen for ‘taking me to writing school’; Cohen’s 2015 novel Book of Numbers is about the ghostwriter of the memoir of a tech billionaire who leaves the grid, making the new book a case of life imitating fiction.) Snowden still faces three felony charges, two of them under the Espionage Act, under which the government can withhold public presentation of the evidence against the defendant by claiming the interests of national security. Obama declined to pardon him. In the wake of Russiagate, it’s common to hear people, even left-leaning journalists, speculate that Snowden has been a Russian agent all along, especially now that the US intelligence services have rebranded themselves as the strongest institutional branch of the #Resistance to the current president. Meanwhile the government’s machines of repression, those carefully cooled datacentres, hum along in Maryland, Utah and Hawaii.