2. He mentions that the problem is arising now, in the 21st century, because we have entered a new wave of technology: robots, engineered organisms, and nanotechnology. These technologies have the dangerous power to self-replicate.
3. He offers a couple of solutions to this problem: "erect a series of shields to defend against each of the dangerous technologies" and "move beyond Earth as quickly as possible."
4. "If we could agree, as a species, what we wanted, where we were headed ,and why, then we would make our future much less dangerous -- then we might understand what we can and should relinquish." We should come at peace with nature and stop our thirst for creation. Scientists must "adopt a strong code of ethical conduct": we must make sure that we don't create machines that have the capacities of mass destruction.
Final point: "common sense says there is a limit to our material needs and that certain knowledge is too dangerous and best foregone."
The piece by Huxley illustrates such a hypothesis. It is a satire of what the future would look like if we followed a utopian path of social stability, in a world ruled by machines. In Brave New World, humans aren't brought up as part of a family; children are raised communally, and information is injected into their brains while they sleep. As a result, their mind, "the mind that judges and desires and decides," is made up of whatever the controllers say.
I thought it was a great passage. It says a lot about freedom and discovery, and it relates very well to the article by Bill Joy.
How do you all want to go about discussing these articles?