In Communications of the ACM, Google's VP of Education points out how calculators impacted math education and wonders whether generative AI will have a similar impact on CS education:
“Teachers had to find the right amount of hand-worked arithmetic and math problem solving to give students the ‘number sense’ they needed to succeed in algebra and calculus. Rely too heavily on the calculator, and number sense dulls. A similar situation arises as educators determine the ‘code sense’ students will need to succeed in this new realm of automated software engineering. It will take several iterations to understand exactly which practices develop sufficient code sense in the era of LLMs, but now is the time to experiment.”
Longtime Slashdot reader theodp points out that this isn't the first time Google executives have had to consider “iterating” their curriculum.
The CACM article echoes comments the VP made in a high-profile talk, “The Future of Computational Thinking,” at last year's Blockly Summit. (Blockly is the Google technology that powers drag-and-drop coding IDEs used in K-12 CS education, such as Scratch and Code.org.) Imagining a world where AI generates code and humans proofread it, Johnson explained: “You can imagine a future where these generative coding systems are so reliable, so capable, and so secure that the amount of time spent doing low-level coding actually decreases. As a result, we'll see students focus more on reading, understanding, and evaluating generated code than on actually writing it. […] I don't think the need to understand code will disappear completely anytime soon […] at least in the short term, there will still be a need to read and understand code so that the reliability and correctness of generated code can be evaluated.” In the Q&A that followed, Johnson was surprised by a question asking whether Blockly is even needed at all in an AI-driven world like the one he'd just described, and the Google VP conceded that it might not be.