Ethics of AI
Assignment 2
September 25, 2025
0 Instructions
• Total points: 10 out of 100.
• Include your full name and student number at the beginning of your submission.
• Deadline: 9 Oct 2025, 4:00 pm (as indicated on the Moodle assignment page)
• Submit a PDF file (no more than 3 pages) via Moodle, using the Assignment 2 submission link.
• Please ensure compliance with the university’s Generative AI (GAI) policy. You may use GAI tools only for brainstorming or language polishing; do not use them to answer the questions directly. If you use any GAI tools, you must state clearly at the beginning of your submission which tool(s) you used and for what purpose.
1 Questions
1. This case study concerns a report from The Washington Post that Google’s algorithm
displayed prestigious job advertisements more often to men than to women (see below).
Answer the following questions:
• (10 %) What might be the reasons for this bias? (Refer to course content: bias sources, ML pipeline, etc.)
• (10 %) What are possible positive or negative consequences?
2. (10 %) How do you assess the consequences of the female/male gender stereotypes manifested in current word-embedding technologies? (See the page “Man is to Computer Programmer as Woman is to Homemaker”.)
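For concreteness, the sketch below (not part of the assignment) shows one way such analogy-style stereotypes can be probed in a pretrained embedding model. It assumes the Python library gensim and its downloadable "glove-wiki-gigaword-100" vectors, neither of which the assignment specifies.

    # Minimal sketch: probing analogy-style gender associations in pretrained
    # word embeddings. Assumes gensim is installed and the
    # "glove-wiki-gigaword-100" vectors can be downloaded (roughly 130 MB);
    # both are assumptions, not assignment requirements.
    import gensim.downloader as api

    # Load pretrained 100-dimensional GloVe vectors (Wikipedia + Gigaword).
    vectors = api.load("glove-wiki-gigaword-100")

    # Analogy query of the form  man : programmer :: woman : ?
    # most_similar ranks the vocabulary words closest to
    #   v("programmer") - v("man") + v("woman"),
    # the kind of query that surfaced the stereotyped completions discussed in
    # "Man is to Computer Programmer as Woman is to Homemaker".
    for word, score in vectors.most_similar(positive=["woman", "programmer"],
                                            negative=["man"], topn=5):
        print(f"{word}\t{score:.3f}")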
3. (20 %) What do you think: Should justice be blind and impartial, or does justice mean creating an advantage for those who are already disadvantaged (cf. Coeckelbergh, M.
(2020). AI Ethics. MIT Press. Chapter 9)?
4. Concerning the veil of ignorance:
• (10 %) Which idea—Harsanyi’s or Rawls’s—do you think better captures the notion of justice? Why?
• (15 %) Do you think this helps establish the notion of justice, at least for the cases presented this week? Use one example to explain why or why not.
5. (25 %) How do you assess the reasonableness of this idea?
One should use a dataset that mirrors the real world. The data may represent prejudices in society, and the algorithm may model existing biases people have, but this is not a problem developers should be worried about.