What is Permacomputing?

Permacomputing

In my last post I wrote about the Critical Infrastructure Lab launch event, held in Amsterdam. I attended the two-day event and will soon write my report and post it on the Bassetti Foundation website, but I couldn’t wait to write about the most engaging and challenging thing I came across there, at a workshop led by Ola Bonati and Lucas Engelhardt: the concept and practices of permacomputing.

As you might imagine, the concept is related to the nature-based practices of permaculture. It encourages a more sustainable approach to computing, one that not only takes into account energy use and hardware and software lifespans but also promotes the use of already available computational resources.

From the starting point that technology has harmed nature, the concept aims to re-center technology and practice and enter into better relations with the Earth.

Practitioners propose a series of research methods that include living labs (we promote this approach in Responsible Innovation research too), science critique, interdisciplinarity and artistic research, which, as many readers will know, is very close to my own heart. Fields include Ecosystems and computational conditions of biodiversity; Sustainability and toxicity of computation; and Biodigitality and bioelectric energy. The Permacomputing network wiki contains the following principles (as well as going into much more detail on all of the above):

Care for life. Create low-power systems that strengthen the biosphere and use the wide-area network sparingly. Minimize the use of artificial energy, fossil fuels and mineral resources. Don’t create systems that obfuscate waste.

Care for the chips. Production of new computing hardware consumes a lot of energy and resources. Therefore, we need to maximize the lifespans of hardware components – especially microchips, because of their low material recyclability.

Keep it small. Small systems are more likely to have small hardware and energy requirements, and they are easier to understand, manage, refactor and repurpose.

Hope for the best but prepare for the worst. It is good practice to keep everything as resilient and collapse-tolerant as possible, even if you don’t believe collapse scenarios are likely.

Keep it flexible. Flexibility means that a system can be used for a vast array of purposes, including ones it was not primarily designed for. Flexibility complements smallness and simplicity. In an ideal and elegant system, the three factors (smallness, simplicity and flexibility) support each other.

Build on solid ground. It is good to experiment with new ideas, concepts and languages, but depending on them is usually a bad idea. Appreciate mature technologies, clear ideas and well-understood theories when building something that is intended to last.

Amplify awareness. Computers were invented to assist people in their cognitive processes. “Intelligence amplification” was a good goal, but intelligence may also be used narrowly and blindly. It may therefore be a better idea to amplify awareness.

Expose everything. Don’t hide information!

Respond to changes. Computing systems should adapt to changes in their operating environments (especially in relation to energy and heat). 24/7 availability of all parts of the system should not be required, and neither should a constant operating performance (e.g. networking speed); a small sketch of what this could look like follows this list of principles.

Everything has its place. Be part of your local energy/matter circulations, ecosystems and cultures. Cherish locality, avoid centralization. Strengthen the local roots of the technology you use and create.
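To give a flavour of what “responding to changes” might mean in practice, here is a minimal, hypothetical sketch in Python. It assumes the psutil library is installed and simply defers a heavy, non-urgent job when the machine is running on a low battery, rather than demanding constant 24/7 performance. The battery threshold, the ten-minute check interval and the job itself are illustrative choices of mine, not something taken from the permacomputing wiki.

```python
# A minimal, hypothetical sketch of the "Respond to changes" principle:
# defer non-urgent work when energy is scarce instead of assuming
# constant 24/7 performance. Requires the psutil library; battery
# readings may be unavailable on machines without a battery sensor.
import time
import psutil

LOW_BATTERY_PERCENT = 30  # arbitrary threshold, chosen for illustration


def energy_is_plentiful() -> bool:
    """Return True when we are plugged in or comfortably charged."""
    battery = psutil.sensors_battery()
    if battery is None:  # no battery sensor (e.g. a desktop machine)
        return True
    return battery.power_plugged or battery.percent > LOW_BATTERY_PERCENT


def run_batch_job() -> None:
    """Stand-in for any heavy, deferrable task (indexing, backups, ...)."""
    print("Running the heavy, deferrable job...")


if __name__ == "__main__":
    while True:
        if energy_is_plentiful():
            run_batch_job()
        else:
            print("Energy is scarce; deferring work until conditions improve.")
        time.sleep(600)  # re-check operating conditions every ten minutes
```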

There is also a page of concepts and ideas needed to discuss permacomputing, as well as a library. You can find links to projects, technology assessments and information about courses and workshops, plus lots of communities to investigate and join, and guidance on how to contribute to the wiki.

Why not join the discussion and spread the word?

Responsible Computing Research Report

A Consensus Study Report

Earlier this year (2022) the Committee on Responsible Computing Research of the National Academies of Sciences, Engineering, and Medicine in the USA published a Consensus Study Report entitled Fostering Responsible Computing Research: Foundations and Practices.

The committee members came predominantly from US universities (with one representative from Australia), but they also included representatives from Microsoft, Google and the Carnegie Endowment for International Peace. The report is a 100-page document, but the following conclusions and recommendations can all be found in the summary.

Conclusions

The process led the committee to three conclusions that underpin the report’s recommendations:

Conclusion 1. To be responsible, computing research needs to expand to include consideration of ethical and societal impact concerns and determination of effective ways to address them.

Conclusion 2. To be responsible, computing research needs to engage the full spectrum of stakeholders and deploy rigorous methodologies and frameworks that have proven effective for identifying the complicated social dynamics that are relevant to these ethical and societal impact concerns.

Conclusion 3. For computing technologies to be used responsibly, governments need to establish policies and regulations to protect against adverse ethical and societal impacts. Computing researchers can assist by revealing limitations of their research results and identifying possible adverse impacts and needs for government intervention.

Recommendations

Recommendation 1. The computing research community should reshape the ways computing research is formulated and undertaken to ensure that ethical and societal consequences are considered and addressed appropriately from the start.

Recommendation 2. The computing research community should initiate projects that foster responsible computing research, including research that leads to societal benefits and ethical societal impact and research that helps avoid or mitigate negative outcomes and harms. Both research sponsors and research institutions should encourage and support the pursuit of such projects.

Recommendation 3. Universities, scientific and professional societies, and research and education sponsors should support the development of the expertise needed to integrate social and behavioral science and ethical thinking into computing research.

Recommendation 4. Computing research organizations—working with scientific and professional societies and research sponsors—should ensure that their computing faculty, students, and research staff have access to scholars with the expertise to advise them in examining potential ethical and societal implications of proposed and ongoing research activities, including ways to engage relevant groups of stakeholders. Computing researchers should seek out such advice.

Recommendation 5. Sponsors of computing research should require that ethical and societal considerations be interwoven into research proposals, evaluated in proposal review, and included in project reports.

Recommendation 6. Scientific and professional societies and other publishers of computing research should take steps to ensure that ethical and societal considerations are appropriately addressed in publications. The computing research community should likewise take steps to ensure that these considerations are appropriately addressed in the public release of artifacts.

Recommendation 7. Computing researchers who are involved in the development or deployment of systems should adhere to established best practices in the computing community for system design, oversight, and monitoring.

Recommendation 8. Research sponsors, research institutions, and scientific and professional societies should encourage computing researchers to engage with the public and with the public interest and support them in doing so.

All of which makes perfect sense, so why not take a look at the report yourself? There is plenty about bias in data sets and other issues that we have addressed previously, and it’s free to download here.

And take a look at what Mozilla have been doing here.