To accompany “Training AI to be really smart poses risks to climate”
Before Reading
1. The word “smart” has been applied to some digital devices, such as phones and speakers. What do people mean when they call these devices smart?
2. Different types of electricity generation emit different amounts of greenhouse gases. Which types emit the most? Which types emit the least?
During Reading
1. What is AI? Give some examples of it from the story. (Extra points if you can name some additional examples not in the story.)
2. What are CO2 equivalents and why did scientists come up with such a term? Why are CO2 equivalents talked about in relation to climate change?
3. What is deep learning in AI?
4. What are GPUs (as defined in the story) and why do they use a lot of energy?
5. What are parameters, as they relate to AI, and how many of them might be used in huge language-processing AI programs?
6. What issues did a March 2021 paper by Emily Bender and others raise about AI language models?
7. Today, developers tend to measure the value of AI systems by how accurately those systems do their job. And that’s certainly important. But why does Roy Schwartz argue this isn’t a sufficient measure of how good an AI program is? On what other features would he like to see AI programs also be judged?
After Reading
1. In what ways can AI be green, or good for the environment? Give some examples. In what ways can it be not-so-good for the environment? Give at least two examples of this.
2. When AI works well, it looks for patterns and makes independent decisions based on the patterns it recognizes. Imagine you work on an AI-development team. Brainstorm with a partner about two things you’d like to develop an AI system to do.