It looks so peaceful up there. Credit: alxpin / Getty

Welcome back to Ars on your Lunch Break! It's been a while since we've done this, so I'll start with a brief orientation. This series is built around the After On Podcast—which itself is a series of deep-dive interviews with thinkers, founders, and (above all) scientists.

Often exceeding 90 minutes, After On episodes run longer than the average busy Ars reader's lunch break. So we carve these unhurried conversations into three to four 30-ish minute segments and run 'em here around lunch, Ars Daylight Time. You can access today's segment via our embedded audio player or by reading the accompanying transcript (both of which are below).

We've presented two seasons of these episodes so far and are planning a third one in the fall. As for this week's run, it's sort of a summer special. The impetus is a talk I gave at April's annual TED conference, which TED will debut on their site's front page tomorrow. I was asked to speak as a direct result of a two-part podcast interview I ran in late March. Some quick cocktail-napkin math may tell you this gave me about 10 days to prepare my talk.

It's not every day that I give a main-stage TED talk (it's once every 2,603 days, based on precisely two data points). So we're marking it with a one-off serialization of the After On interview that triggered it. Segments will run daily through Thursday, and tomorrow we'll also embed a video of the TED talk (and the next extended series of these interviews will come to Ars in the fall).

Let's talk about death, baby

This weeks guest is Naval Ravikant. Ravikant is a renowned angel investor and entrepreneur who conjoined these callings by founding AngelList in 2010. AngelList is now a fundraising juggernaut, and almost 30% of significant US tech startups have raised at least some money through the platform in recent years.

But our topic this week is something quite a bit darker than entrepreneurial finance. Specifically, it's existential risk. This refers to a set of dangers which might, in a worst-case scenario, imperil humanity's very existence. Many of these dangers could be enabled by near-term scientific and technological developments. Naval and I will particularly focus on risks connected to synthetic biology. This is ironic, because as regular After On listeners know, I'm a hopeless synbio fanboy. Naval and I will also touch more briefly on certain risks connected to superintelligence research.

Click here for a transcript and click here for an MP3 direct download.

Unusually for my show, this is more of a conversation than an interview—because neither of us is what you'd call an existential risk professional. Rather, we've both thought, read, and spoken a…

Ars Technica

[contfnewc] [contfnewc]