Hey everyone! My name is Jacob Haimes, and I host the Into AI Safety podcast. At this point I have released 15 episodes of various lengths, and if you're interested, you can check out my initial motivation in a post I made to LessWrong.

The important part, though, is that this week's episode of the podcast is an interview with Dr. Peter Park. Along with Harry Luk and one other cofounder, he started StakeOut.AI, a non-profit with the goal of making AI go well, for humans. 

Unfortunately, due to funding pressures, the organization recently had to dissolve, but the founders continue to contribute positively to society in their respective roles.

Nonetheless, the interview gives great coverage of some of the first struggles and accomplishments that have happened since "AI" hit the mainstream.

Note that the interview will be released as three episodes, and this one is only the first in the series. This interview was made possible through the 2024 Winter AI Safety Camp.

As I have mentioned previously, any feedback, advice, comments, etc. are greatly appreciated.

Spotify
Apple Podcasts
Amazon Music

Comments

Thanks for this episode and flagging it with a Forum post. Love the little sound that alerts me to links.

This is a really interesting podcast, particularly the section discussing foundation models and cost analysis. You mention a difficulty in exploring this. If you ever want to explore it, I'm happy to give some insight via inbox, as I've done a bit of work in industry in this area that I can share.

Hi CAISID, thanks for letting me know that you think the podcast is interesting!

(For context to others, the original comment refers to an aside I give at 27:04 about "foundation models".)

I definitely did have a difficult time finding concrete answers regarding the training costs of current cutting-edge foundation models. The only primary-source numbers I could find from the big companies, i.e. from people within the company or from press releases, were very loose figures that included salaries, which basically makes them meaningless (at least for the information I want out of them).

There has been some work done on estimating training costs (e.g., Epoch or the AI Index Report), but it seemed that I would need to spend a significant amount of time collecting data, and even doing some forecasting, to actually get approximations for the current state of the art.

Would love to hear your thoughts on this, either here or via whatever messaging format you prefer.

Good initiative! And the website looks great! It would be nice if there was a way to speed up the audio. 

Hey Chris, thanks for the comment!

I am slightly confused about it though, as I am able to control the playback speed of the player on my website (in 0.25x increments, from 0.25x to 2x). Additionally, I know that on Spotify the episode can be played back at up to 3.5x.

If anyone is still out there thinking: "I really can't stand people's voices unless they are sped up by more than 3.5x," you can also download the mp3 file directly from my podcast website.

Let me know if I am missing something here, as I want to make sure my content is maximally accessible to everyone :)

I see the option now; it was hidden behind the three dots. Thanks! :)

Of course, glad I could help!
