SUPERINTELLIGENCE:
Paths, Dangers, Strategies

by Nick Bostrom

(Oxford University Press, 2014)
Hardcover, 352 pages
ISBN: 9780199678112
Price: AUD$56.95

Book description

• Original material based on new research.

• Written by one of the leaders in the field.

• Novel concepts and terminology are explained, making the book suitable for the general reader.

The human brain has some capabilities that the brains of other animals lack. It is to these distinctive capabilities that our species owes its dominant position. Other animals have stronger muscles or sharper claws, but we have cleverer brains.

If machine brains one day come to surpass human brains in general intelligence, then this new superintelligence could become very powerful. As the fate of the gorillas now depends more on us humans than on the gorillas themselves, so the fate of our species then would come to depend on the actions of the machine superintelligence.

But we have one advantage: we get to make the first move. Will it be possible to construct a seed AI or otherwise to engineer initial conditions so as to make an intelligence explosion survivable? How could one achieve a controlled detonation?

To get closer to an answer to this question, we must make our way through a fascinating landscape of topics and considerations. Read the book and learn about oracles, genies, and singletons; about boxing methods, tripwires, and mind crime; about humanity’s cosmic endowment and differential technological development; about indirect normativity, instrumental convergence, whole brain emulation, and technology couplings; about Malthusian economics and dystopian evolution; and about artificial intelligence, biological cognitive enhancement, and collective intelligence.

This profoundly ambitious and original book picks its way carefully through a vast tract of forbiddingly difficult intellectual terrain. Yet the writing is so lucid that it somehow makes it all seem easy. After an utterly engrossing journey that takes us to the frontiers of thinking about the human condition and the future of intelligent life, we find in Nick Bostrom’s work nothing less than a reconceptualisation of the essential task of our time.

Readership: General readers as well as academics in the fields of artificial intelligence (AI) and machine learning, computer science, and philosophy.


About the author

Nick Bostrom is a professor in the Faculty of Philosophy at Oxford University and founding director of the Future of Humanity Institute and of the Programme on the Impacts of Future Technology within the Oxford Martin School. He is the author of some 200 publications, including Anthropic Bias (Routledge, 2002), Global Catastrophic Risks (ed., OUP, 2008), and Human Enhancement (ed., OUP, 2009). He previously taught at Yale, and he was a Postdoctoral Fellow of the British Academy. Bostrom has a background in physics, computational neuroscience and mathematical logic as well as philosophy.


What the critics say

“Nick Bostrom makes a persuasive case that the future impact of AI is perhaps the most important issue the human race has ever faced. Instead of passively drifting, we need to steer a course. Superintelligence charts the submerged rocks of the future with unprecedented detail. It marks the beginning of a new era.” — Stuart Russell, Professor of Computer Science, University of California, Berkeley.

“Those disposed to dismiss an ‘AI takeover’ as science fiction may think again after reading this original and well-argued book.” — Martin Rees, Past President, Royal Society.

“A magnificent conception ... it ought to be required reading on all philosophy undergraduate courses, by anyone attempting to build AIs ... and by physicists who think there is no point to philosophy.” — Brian Clegg, Popular Science.

“There is no doubting the force of [Bostrom’s] arguments ... the problem is a research challenge worthy of the next generation’s best mathematical talent. Human civilisation is at stake.” — Clive Cookson, Financial Times.

“This superb analysis by one of the world’s clearest thinkers tackles one of humanity’s greatest challenges: if future superhuman artificial intelligence becomes the biggest event in human history, then how can we ensure that it doesn’t become the last?” — Professor Max Tegmark, MIT.

