How to use AI to prepare for a job interview

AI isn’t just for big organisations; there are many ways you can use AI tools to improve your speed and effectiveness at everyday tasks too.

One example is to help you prepare for a job interview. It can be difficult to know how to spend your time effectively, so why not enlist the help of an AI chatbot like Bard or ChatGPT to come up with ideas, and better still, get it to do some of the work for you?

Job Spec

Where to start? First, try feeding ChatGPT the job description you’ve been given, and ask it to suggest how you should approach interview prep. Here are a few examples of prompts to try:

Prompt 1

I've got an interview for the attached role, could you give me a 50-word summary of how you'd advise I prepare?

If you can’t upload documents, just copy the text after the prompt instead.

Prompt 2

I need to prep for an interview tomorrow. I only have 20 mins. Please read the background info doc (attached) and give me a list of 5-10 bullet points walking me through how to use the 20 minutes to get ready. The list should take me no more than 3 minutes to read.

The more specific you are about your requirements, the better the answer will be. For example, you could ask it to answer in a numbered list, or to display the information in a table. You could ask for the response to be no more than 50 words, or for it to be written so that it could be easily understood by a five-year-old.
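If you find yourself writing similar prompts again and again, you can even assemble them with a short script before pasting them into a chatbot. This is just an illustrative sketch — the function name and default values below are my own invention, not part of any chatbot's interface — showing how spelling out constraints (time available, number of bullets, reading time) makes a prompt specific:

```python
def build_prep_prompt(job_title, minutes_available, max_points=10, reading_time_mins=3):
    """Assemble a specific, constrained interview-prep prompt.

    Stating the constraints explicitly (how long you have, how many
    bullet points you want, how long the answer should take to read)
    tends to produce a more usable response from a chatbot.
    """
    return (
        f"I have an interview for a {job_title} role and only "
        f"{minutes_available} minutes to prepare. "
        f"Give me a numbered list of at most {max_points} bullet points "
        f"walking me through how to use that time, written so it takes "
        f"no more than {reading_time_mins} minutes to read."
    )

# Example: a 20-minute prep window for a hypothetical library assistant role.
prompt = build_prep_prompt("library assistant", 20)
print(prompt)
```

The result is a single, well-constrained prompt you can paste straight into Bard or ChatGPT, tweaking the numbers to suit how much time you actually have.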

Prompt 3

Help! I've got an interview and I'm really out of practice. Could you read the job spec and give me some pointers on what I should be doing?

Sometimes being more creative with your prompt can lead to a more interesting response from the AI.

An example conversation you could have with ChatGPT

Gather More Info

Don’t forget: if you don’t get the response you want, or you need more information, try using follow-up prompts. For example, if it tells you to check out the organisation’s website, ask it to do the legwork for you. Here are some examples of how you can ask.

Follow-up prompt 1

That's great, thanks. Here's the library's website, could you read it and provide me with a concise summary that I can use to quickly get up to speed before my interview? www.example-library.org/about

Follow-up prompt 2

I don't know if they have a website actually. Could you search the web and see if you can find one? If they don't, how would you suggest I research the library, in the most time-effective way, given I've only got 20 minutes to prepare?

Most AI chatbots are designed to understand natural language, so write back as though you were talking to a person. Good spelling and grammar will help you get a better response the first time around, but don’t be afraid to be creative with the instructions you use.

Caution: be mindful not to share personal information with an AI chatbot. If you wouldn’t tell something to a random stranger, don’t type (or paste) it into a chatbot either.

Questions

In any interview, you’re going to be asked questions, so why not practise with the help of AI?

Question prompt 1

I haven't interviewed for this sort of position before, could you give me a list of 5 questions that you think I could be asked, based on the job description?

Question prompt 2

Let's role play the interview, you ask me a question the interviewer is likely to ask, and I'll type my reply. You can then give me feedback on my answer and tips to improve.

Question prompt 3

Could you provide a list of the 5 most common interview questions and 2-3 bullets on how to answer them?

It’s also a good idea to turn up with a few questions of your own. You can ask the AI to help you format these, or if you’re running short of ideas, ask it to give you some suggestions!

Follow-up questions prompt

What questions could I ask the interviewer to help them see I'm interested in the role and want to work for this organisation?

Sharing Ideas

Do you have any tips on how to use ChatGPT to prepare for an interview? Why not help others too, by sharing your prompt ideas below.

If you’re early to this post and the comments section is a little empty, why not ask Bard to come up with some ideas for you instead!

Asking Bard how to prompt ChatGPT to get ready for a job interview

Some Thoughts on the Film Oppenheimer

The Bomb as a Game Changer

As regular readers will know, the Technology Bloggers platform has a partnership with the Bassetti Foundation. As part of my own collaboration with the Foundation I edited the International Handbook on Responsible Innovation, and in this book Foundation President Piero Bassetti explains that innovation requires a surplus of knowledge alongside a surplus of power.

This argument was not new for him, though; it had already been addressed in his book Le Redini del Potere (The Reins of Power), written with Giacomo Corna Pellegrini back in 1959.

In this book (from a time of rapid change, when Fidel Castro became President of Cuba, the first two primates survived space flight, and nylon tights (pantyhose) were released to the public), the authors discuss the decision taken by then-President Franklin Roosevelt to pursue research into a weapon that, for the first time, could bring humanity itself to an end.

This decision is seen as a pivotal point in the relationship between science and politics, and in the notion of collective responsibility that underpins the Bassetti Foundation’s mission to promote responsibility in innovation.

This surplus of knowledge and power is something that can be clearly seen in the latest Oppenheimer film, as the knowledge surplus is created by gathering the world’s greatest scientific minds together, all carried out under the drive and with the funding of the US government (the surplus of power). The US army offers the infrastructure to put the whole plan together.

Without the political will and capability to carry out the project, the surplus of knowledge remains just that, knowledge. For it to become (an) innovation, it has to change something, to be implemented, which brings in the influence of power, money, and in the old days at least, government.

This brings the type of questions about responsibility that we have been asking in the Bassetti Foundation for the last thirty years, and which are related to its approach and interests. If we follow Bassetti’s line of thinking as outlined in the Handbook, knowledge remains knowledge in the absence of political will and capacity, so responsibility must lie with the political decision-makers, or in other words, with power.

A single line in the new Oppenheimer film expresses this idea, uttered by Harry Truman, the US President who took the decision to drop the bombs on Japan: ‘those people in Hiroshima and Nagasaki aren’t interested in who built the bomb, they are interested in who dropped it’.

Who is Responsible, The Individual or the Position?

In the case in question the US President claims responsibility for the dropping of the bomb, but if we follow Bassetti, as President he also in some way ‘represents’ responsibility for the discovery of the bomb itself, even though the process was started by his predecessor. From some perspectives (those that see a ‘many hands’ problem), the discovery and production process brings joint responsibility; it requires military personnel and logistic capacities, scientists as well as finance, goodwill from family members, collaboration and political support. But we could also say that the process is fundamentally political and facilitated by power: the same power that decides to facilitate, design and implement the process then decides what to do with the results.

This point of who controls the process (and is therefore responsible for it) comes up once more in the film, as Oppenheimer (having delivered a bomb to the military for use) starts to tell a soldier about the effects of exploding the bomb at different altitudes. The soldier responds by making it clear that the military would be taking all of the decisions from then on, and that they would decide on the logistics. Once the bomb was ready, it was made clear to the scientists that they did not have any say in how it might be used. It was never their bomb, and their role had been completed.

Another interesting element of the film develops as Oppenheimer moves to limit the effects of the invention. He proposes sharing knowledge of the discovery with the allies (the Russians), a moratorium and international governance of the new weapon, and a halt to further developments that would lead to an arms race. If we want to bring this into the present, there has recently been a great deal of debate about how to govern developments in AI, including calls for a possible moratorium.

Rather than just seeing this as a problem of care, it can also be seen from the point of view of how perceptions of responsibility change over time. During a war (although there is some discussion about the bomb being unnecessary, as the German government had surrendered), the development of such a weapon is justified, even seen as necessary. But once the war is won, or almost won, its existence should be problematized.

Starlink

Returning to present-day developments, the press that Elon Musk received back in September, and the revelations made in a recent book about his Starlink project, bring up several similar questions. Whatever the truth about the denied request to use Starlink to facilitate an attack on the Russian Black Sea fleet, Musk finds himself and his company participating in warfare. Echoing the position that Oppenheimer finds himself in (as portrayed in the film), he remarks that the purpose of Starlink was not to facilitate war but to facilitate gaming and internet access for all. But once the technology is available, its use may be difficult for those who enabled it to control.

The problem of many hands is not as evident in this situation, however. Starlink resembles a family business: the surplus of knowledge and the surplus of power, will and capability all lie in the hands of one person. I have not heard any talk of a moratorium, or of international governance for that matter, which raises several fundamental questions. What is the role for governance in this situation? Or the role of political will or finance? What are the implications for thinking about democracy? Where should Responsible Innovation practices be focused if there is a lack of external governance mechanisms? What are the implications of the fact that both sides in this war rely on Starlink to facilitate their actions?

Could we see Elon Musk as playing a multifaceted role: innovator and politician, mediator and strategist?

Theatre in Responsible Innovation


Art in Responsible Innovation

As regular readers will know, I have been writing about artists raising ethical issues about technology for several years.

Back in 2021 I interviewed Maurizio Montalti, an artist/scientist who works with new fungus-based materials, and Rodolfo and Lija Groenewoud van Vliet of In4Art who talked about their Art Driven Innovation approach.

In the video above I speak to Mireia Bes Garcia from Bristol University and Oliver Langdon of Kilter Theatre about their collaborations on immersive theatre projects on quantum/virtual reality and synthetic biology.

What is really unique about this theatre approach is that the researchers were integral to the performances and workshops, in a process that offered them the tools to address ethical considerations in their work while relying on their expertise to develop the story and performances.

This wide-ranging conversation addresses issues such as university/artist collaborations, the advantages of developing long-term working relationships, and some of the complexities of using theatre practice within Responsible Innovation approaches.