There have been a lot of changes to this unit. Those changes were well described in a timely manner. As always, I am grateful for the competence and professionalism of code.org.
I am concerned about the bias in the parts that are left. In two different third-party videos, “predictive policing” is described as an application of big data and artificial intelligence. In neither case is predictive policing sufficiently critiqued. There is a strong and well-understood body of work critiquing predictive policing as a tool that exacerbates existing bias in policing; specifically, it perpetuates and codifies a negative bias against Black people.
Additionally, in the WSJ opinion piece “It’s Modern Trade: Web Users Get as Much as They Give”, students are presented with an opinion that is very corporate-friendly. That is, of course, a good and reasonable thing to do. It would be equally good and reasonable to present an opposing opinion, especially given our world.
This class is important because it gives students the tools they need to think critically about their world. To quote the AP Explore Task: “Computing innovations impact our lives in ways that require considerable study and reflection for us to fully understand them.” For example, last year we saw how social media advertising, micro-targeting, and filter bubbles were used to proliferate “fake news”. I would love to see this as a counterpoint to the WSJ piece. Maybe that’s just my job?
Finally, I would like to address big data and artificial intelligence. In the Code.org curriculum resources like those mentioned above, artificial intelligence and automation in general are represented as inevitable and faceless, as in “the robots will take our jobs”. Zuckerberg did this in his testimony before Congress: when asked how he will deal with flagging hate speech, he said “AI”. The truth is that artificial intelligence is just a computer program written by unavoidably biased humans (see predictive policing). The computers are not taking our jobs. More accurately, jobs are being eliminated by companies that are deploying technology to automate tasks formerly done by humans. Similarly, companies hire humans to write AI programs, which are exactly as flawed as the people and institutions that create them. What Zuckerberg was actually saying was not “AI will fix it” but “Facebook will fix it”. Except that is not as convincing, because Facebook created the problem in the first place.