Some Thoughts on the Film Oppenheimer

The Bomb as a Game Changer

As regular readers will know, the Technology Bloggers platform has a partnership with the Bassetti Foundation. As part of my own collaboration with the Foundation I edited the International Handbook on Responsible Innovation, and in this book Foundation President Piero Bassetti explains that innovation requires a surplus of knowledge alongside a surplus of power.

This argument was not new for him, though; he had already addressed it in his book Le Redini del Potere (The Reins of Power), written with Giacomo Corna Pellegrini back in 1959.

In this book (from a time of rapid change, when Fidel Castro became President of Cuba, the first two primates survived space flight, and nylon tights (pantyhose) were released to the public), the authors discuss the decision taken by then-President Franklin Roosevelt to pursue research into a weapon that, for the first time, could bring humanity itself to an end.

This decision is seen as a turning point in the relationship between science and politics, and in the notion of collective responsibility that underpins the Bassetti Foundation’s mission to promote responsibility in innovation.

This surplus of knowledge and power is something that can be clearly seen in the latest Oppenheimer film, as the knowledge surplus is created by gathering the world’s greatest scientific minds together, all carried out under the drive and with the funding of the US government (the surplus of power). The US army offers the infrastructure to put the whole plan together.

Without the political will and capability to carry out the project, the surplus of knowledge remains just that, knowledge. For it to become (an) innovation, it has to change something, to be implemented, which brings in the influence of power, money, and in the old days at least, government.

This raises the type of questions about responsibility that we have been asking at the Bassetti Foundation for the last thirty years, questions that are closely related to its approach and interests. If we follow Bassetti’s line of thinking as outlined in the Handbook, knowledge remains knowledge in the absence of political will and capacity, so responsibility must lie with the political decision-makers, or in other words, with power.

A single line in the new Oppenheimer film expresses this idea, uttered by Harry Truman, the US President who took the decision to drop the bombs on Japan: ‘those people in Hiroshima and Nagasaki aren’t interested in who built the bomb, they are interested in who dropped it’.

Who is Responsible, the Individual or the Position?

In the case in question the US President claims responsibility for dropping the bomb, but if we follow Bassetti, as President he also in some way ‘represents’ responsibility for the discovery of the bomb itself, even though the process was started by his predecessor. From some perspectives (those that see a ‘many hands’ problem), the discovery and production process brings joint responsibility: it requires military personnel and logistic capacities, scientists as well as finance, goodwill from family members, collaboration and political support. But we could also say that the process is fundamentally political and facilitated by power; the same power that decides to facilitate, design and implement the process then decides what to do with the results.

This point of who controls the process (and is therefore responsible for it) comes up once more in the film, as Oppenheimer (having delivered a bomb to the military for use) starts to explain to a soldier the effects of detonating the bomb at different altitudes. The soldier responds by making it clear that the military would be taking all of the decisions from then on, and that they would decide on the logistics. Once the bomb was ready, it was made clear to the scientists that they did not have any say in how it might be used. It was never their bomb, and their role had been completed.

Another interesting element of the film develops as Oppenheimer moves to limit the effects of the invention. He argues for the need to share knowledge of the discovery with the allies (the Russians), to propose a moratorium and international governance of the new weapon, and to halt further developments that would lead to an arms race. If we want to bring this into the present, there has recently been lots of debate about how to govern developments in AI, including about a possible moratorium.

Rather than just seeing this as a problem of care, it can also be seen from the point of view of how perceptions of responsibility change over time. During a war (although there is some discussion about the bomb being unnecessary once the German government had surrendered), the development of such a weapon is justified, even seen as necessary. But once the war is won, or almost won, its existence should be problematized.

Starlink

Returning to present-day developments, the press that Elon Musk received back in September and the revelations made in a recent book about his Starlink project bring up several similar questions. Whatever the truth about the reportedly denied request for Starlink coverage to facilitate an attack on the Russian Black Sea fleet, Musk finds himself and his company participating in warfare. Echoing the position that Oppenheimer finds himself in (as portrayed in the film), he remarks that the purpose of Starlink was not to facilitate war but to facilitate gaming and internet access for all. But once the technology is available, its use may be difficult for those who enabled it to determine.

The problem of many hands is not as evident in this situation, however. Starlink resembles a family business: the surplus of knowledge and the surplus of power, will and capability all lie in the hands of one person. I have not heard any talk of a moratorium, or international governance for that matter, which raises several fundamental questions: What is the role for governance in this situation? Or the role of political will or finance? What are the implications for thinking about democracy? Where should Responsible Innovation practices be focused if there is a lack of external governance mechanisms? What are the implications of the fact that both sides in this war rely on Starlink to facilitate their actions?

Could we see Elon Musk as playing a multifaceted role, of innovator and politician, mediator and strategist?

Computing Within Limits

LIMITS

I have just attended LIMITS 22, the eighth annual workshop on computing within limits.

As the name suggests, the workshop addresses the role of computing in human societies affected by real-world limits, for example limits of extractive logics, limits to a biosphere’s ability to recover, limits to our knowledge, or limits to technological “solutions”.

Very much tied to the interests of the Technology Bloggers website, this collection of researchers and practitioners aims to reshape the computing research agenda, grounded in an awareness that contemporary computing research is intertwined with ecological limits in general, and with climate- and climate justice-related limits in particular.

This was a virtual distributed workshop, with many participants joining hubs so that they could avoid travel but still attend a social event. I touched upon this as a model in my post about conferencing a few weeks ago.

I attended one such hub in Rotterdam (Netherlands), held at Varia, a space for developing collective approaches to everyday technology. There were a dozen people there, computer programmers, university lecturers, students and the like, which made for interesting discussion during the break-out sessions and a very nice social mix.

I won’t go into the individual presentations too much, but would like to highlight a few of the questions addressed and point readers towards some resources.

What is the carbon footprint of streaming media?

Researchers estimate that streaming media accounts for about 1% of global carbon emissions. These emissions are created throughout the chain, with only a small percentage visible to users (the electricity that appears on their household bills); the vast majority is hidden, produced during data storage, cooling, delivery, maintaining back-up systems and a myriad of other processes (not to mention construction, mining of raw materials, etc.).

This website offers lots of information, beginning with the startling revelation that ICT in general is estimated to consume about 7% of all electricity generated, so it may contribute (depending on how the electricity is produced) up to almost 4% of global greenhouse gases.

So the actual carbon footprint is very difficult to measure: estimates for watching a single streamed film range from the equivalent of burning 1.2 kilos of coal to 164 kilos (depending on the calculation, not the film).
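To make that spread concrete, here is a minimal back-of-the-envelope sketch in Python. Every figure in it (device power, data volume, network and data centre energy per gigabyte, grid carbon intensity, the coal emission factor) is a hypothetical assumption chosen purely to show how much the answer swings with the assumptions, not a published measurement.

```python
# Illustrative sketch only: why streaming-footprint estimates vary so widely.
# All figures below are hypothetical assumptions, not measured values.

KG_CO2_PER_KG_COAL = 2.4  # rough, assumed emission factor for burning hard coal


def streaming_footprint_kg_co2(hours, device_kw, gb_per_hour,
                               network_kwh_per_gb, datacentre_kwh_per_gb,
                               grid_kg_co2_per_kwh):
    """Very simple model: device + network + data centre electricity,
    multiplied by the carbon intensity of that electricity."""
    device_kwh = hours * device_kw
    data_gb = hours * gb_per_hour
    network_kwh = data_gb * network_kwh_per_gb
    datacentre_kwh = data_gb * datacentre_kwh_per_gb
    total_kwh = device_kwh + network_kwh + datacentre_kwh
    return total_kwh * grid_kg_co2_per_kwh


# Optimistic assumptions: efficient laptop, modern network, low-carbon grid.
low = streaming_footprint_kg_co2(hours=2, device_kw=0.02, gb_per_hour=1.5,
                                 network_kwh_per_gb=0.1,
                                 datacentre_kwh_per_gb=0.05,
                                 grid_kg_co2_per_kwh=0.05)

# Pessimistic assumptions: large TV, older network-energy figures, coal-heavy grid.
high = streaming_footprint_kg_co2(hours=2, device_kw=0.15, gb_per_hour=7,
                                  network_kwh_per_gb=1.5,
                                  datacentre_kwh_per_gb=0.5,
                                  grid_kg_co2_per_kwh=0.9)

for label, kg_co2 in (("optimistic", low), ("pessimistic", high)):
    print(f"{label}: {kg_co2:.2f} kg CO2, "
          f"roughly {kg_co2 / KG_CO2_PER_KG_COAL:.2f} kg of coal equivalent")
```

The two scenarios differ by roughly three orders of magnitude for exactly the same film, which is the point being made above: the footprint depends far more on the assumptions than on the content being streamed.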

The large data centre providers often claim that they use clean energy for their centres, but this was also questioned: their mass use of this energy has been shown to monopolize access, at the very least having an enormous effect on local grids and sometimes forcing others onto fossil fuels. Their green claims were described as cherry-picked.

Digital platforms

Well, we all love a digital platform, don’t we? Facilitating car sharing, what could be better than that? Yet even here a critical perspective appears, as we have to add ICT emissions to the direct emissions when calculating the possible environmental implications. And not only that! For analytical purposes, using one car instead of two halves the emissions, but on top of this we should add the ICT emissions (which, as we know, are difficult to work out), though we can come up with an estimate. Then behavioural change might also come into play: people might drive further because they are sharing, and some will share a car and leave the bike at home or skip the usual train. It all becomes rather murky.
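As a toy illustration of why it becomes murky, here is a small sketch with entirely made-up numbers for trip length, emissions per kilometre, platform ICT overhead and the rebound effect; it only shows how the headline ‘halving’ shrinks once those factors are added in.

```python
# Toy illustration with made-up numbers: car sharing halves driving emissions
# on paper, but ICT overhead and behavioural change eat into the saving.


def net_saving_kg_co2(trip_km, trips, kg_co2_per_km,
                      ict_overhead_kg_co2, rebound_extra_km):
    """Emissions saved by two people sharing one car for `trips` trips,
    minus the platform's ICT overhead and any extra kilometres driven
    because sharing made the trip cheaper or more convenient."""
    baseline = 2 * trips * trip_km * kg_co2_per_km                 # two separate cars
    shared = trips * (trip_km + rebound_extra_km) * kg_co2_per_km  # one car, maybe a detour
    return baseline - shared - ict_overhead_kg_co2


# On paper: no rebound, no ICT overhead, so the saving is simply half.
print(net_saving_kg_co2(trip_km=20, trips=100, kg_co2_per_km=0.15,
                        ict_overhead_kg_co2=0, rebound_extra_km=0))    # 300.0

# Add a modest ICT overhead and a few extra kilometres per trip and
# the saving shrinks considerably.
print(net_saving_kg_co2(trip_km=20, trips=100, kg_co2_per_km=0.15,
                        ict_overhead_kg_co2=40, rebound_extra_km=8))   # 140.0
```

None of these figures are real data; the sketch simply shows that the net benefit depends on quantities (ICT emissions, behavioural change) that are genuinely hard to pin down.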

Other discussions

Other questions arose: what are the implications of framing the discussion in terms of limits, rather than abundance? Could such a reframing bring in an ethics of care? Can we discuss the relationship between humans and nature and its ties to capitalism? What role can religion take? How important are imaginaries of the (technological) future? Does the public have the information required to understand the environmental implications of their choices?

As you can see, it was very stimulating.

Check out this website for a perspective.

And the Chaos Computer Club for another.

The papers are all available here so fill your boots.

Part 2: The Responsible Innovation Circus Rolls into Town

A lighthearted view of Responsible Innovation

Alternative Teaching Methods for Responsible Innovation

There are many ways of presenting complex arguments about ethics, and there are some interesting examples of theatre being used to do so. The embedded video above was made for the closing conference of ROSIE, a European Union funded project.

The video presents a light-hearted look at the problems faced when trying to introduce the RI concept to small businesses. It was made in character (following the tradition of using action theatre in academia as a teaching tool for ethics) and addresses the gap between the language used in RI publications (of all sorts) and that used in the small business world.

It uses a circus metaphor to represent the balancing act prescribed in various EU and standards documentation related to Responsible Innovation, portraying the balance as a high-wire walk and the prescribed goals and approaches as juggling balls.

In making this video I was very much influenced by a US-based university professor who uses radio plays, theatre and art that he produces himself in his teaching.

Richard G Epstein

Richard G Epstein works in the Department of Computer Science, West Chester University of PA, where he teaches courses on computer science, software engineering and computer security and ethics.

His university home page provides access to some of his publications, and I would like to have a look at three of them. He produces teaching aids that are extremely entertaining and require no technical understanding.

Artistic Work

The Case Of The Killer Robot is a series of fake newspaper articles that report the story of an accident at work involving an industrial robot and its operator. The early articles are descriptions of the accident, but as the work moves on it slides into legal and ethical territory. Who is actually to blame for the malfunctioning of the machine, and who should be held legally responsible? One of the programmers is found to have misunderstood a piece of code scrawled on a post-it, and his translation error is deemed to have caused the death, so he is charged with manslaughter.

The reporters visit the factory and interview fellow workers, producing articles in which they reveal problems within the organization and the managerial team and slander the poor programmer’s personality, all in perfect local journalism hack style. They include expert interviews and uncover both design faults and personal differences between members of the development strategy team.

The issue of responsibility is really brought to the fore: although the setting is fictional, the ethical dilemmas faced during development and the working practices involved are laid bare. It makes for a very entertaining and thought-provoking read.

The Plays

The second work I recommend is one of the author’s plays entitled The Sunshine Borgs. The play is set in the near future and tells the story of a bitter ex-playwright who has lost his job and seen the demise of human participation in the arts caused by the development of computer programmes that write plays and music for human consumption.

The play investigates the threat that high-powered “intelligent” computing could pose to human creativity. Robots have taken over as actors, lovers, authors and just about everything else. Pain and suffering, poverty and crime have been all but eradicated, but has humanity lost its passion? The play contains a twist that I won’t reveal, and the writing style even manages to portray the effect that working in a computer environment can have on language use and thought processes.

The author describes this work as a comedy, but it is too close to the bone to make you really laugh. Questions such as the legal rights of robots and the possibility of charging a human with robot abuse are raised when the main character’s unwanted robot companion commits suicide as a result of the playwright’s attempts to instil a soul into his hated but extremely useful houseguest.


Another of his plays, entitled NanoBytes, addresses the problems raised by nanotechnology, telling the story of the head of a nanotechnology company and a small mishap involving molecular-sized computers that can be injected into people in order to stop anti-social behaviour. It is a much shorter read, but it has some equally interesting twists and an insightful tongue-in-cheek description of American family and business life.