• Los Angeles-based writer, marketer, and content creator.
• Alumnus of the 2014 Warner Bros. Television Writers' Workshop.
• Former business owner, inventor, and creative director.
• Broad longtermist, but not certain about anything...
Thanks for your explanations!
Re: Questions
Apologies… I mean the questions your team decides upon during your research and interview processes (not the initial prompt/project question). As a generalist, do you ever work with domain experts to help frame the questions (not just get answers)?
Re: Audit tools
I realize that "tools" might have sounded like software or something, but I'm thinking more of frameworks that can help weed out potential biases in data sets (e.g., algorithmic bias, the clustering illusion), studies (e.g., publication bias, parachute science), and individuals (e.g., cognitive biases, appeals to authority). I'm not suggesting you encounter these specific biases in your research, but I imagine there are known (and unknown) biases you have to check for and assess.
Re: Possible approach for less bias
Again, I'm not a professional researcher, so I don't want to assume I have anything novel to add here. That said, when I read about research and/or macro analysis, I see a lot of emphasis on things like selection and study design, but not as much on the curation or review teams, i.e., who decides?
My intuition tells me that, along with study designs, curation and review are particularly important for weeding out bias. (The merry-go-round water pump story in Doing Good Better comes to mind.) You mentioned sometimes interviewing people with differing or opposing views, but I imagine these interviews happen within the research itself and are usually with other academics or recognized domain experts (please correct me if I'm wrong).
So, in the case of, say, a project by an org from the Global North that would lead to action/policy/capital allocation in/for the Global South, it would seem that local experts should also have a “seat at the table,” not just in providing data, but in curating/reviewing/concluding as well.
With this post almost a year old now, I was curious if any of the commenters who were interested in switching to EA-related work have pursued this route. If so:
Thanks for sharing. I'm not a professional researcher, but I spend a fair bit of time researching personal projects, areas of interest, etc., and I enjoy learning about different exploration frameworks and processes. As a generalist myself, I sometimes find it difficult to know if I'm adding signal or noise to a picture I've yet to fully envisage -- particularly where a high level of outside domain or technical knowledge is necessary.
In my experience, beneficial answers are often the result of pinging the right sources with the right queries. This alone can be a difficult chain to establish, but there's a deeper layer that strikes me as paradoxical: in most cases, the person/team/org seeking knowledge is also the arbiter of the information. So...
Thanks for the referral. Interesting post -- even if much of the technical-speak is lost on me. What I gathered is that nobody really knows if/when software engineering will become an unskilled job (no surprise), but a) many are confident that it won't be anytime soon (at least for the discipline as a whole), and b) junior developers are the ones that LLMs are likely to replace (est. 1-3 yrs.).
While much of the thread's early sentiment echoes replies here, there's a divergence concerning newer engineers as the conversation continues. It's these bearish predictions that worry me. I don't need to make six figures, but I can't invest time (6-12 mo.) and money (courses, bootcamps, etc.) in a career path where newbie "escape velocity" is unlikely. More to think about...
No, I've already made the decision to leave copywriting (unless an opportunity to have an incredible impact comes my way).
Software engineering and data science were the two paths I was considering, but engineering won out because 1) it's an end-to-end (idea-to-product) creation tool, and 2) it doesn't require me to first become proficient in probability/statistics. The latter is something I eventually hope to do, but, financially, I can't afford to ramp up in math, then data science, then find a job. And while it's estimated that data science roles will grow at a faster rate than jobs in software engineering, there are far fewer overall spots available in data science. Being at the midpoint of my career, I seem more likely to make a meaningful contribution somewhere as a software developer than as a data scientist. Lastly, I'd assume data science is the type of skill AI will replace before software engineering (but that's a huge guess).
Thanks for that perspective. Given that I don't have experience in the programming space, I couldn't project a timeline between fully automated software production and AGI -- but your estimate puts something on the map for me. It is disconcerting, though, as there are many different assumptions and perspectives about AGI, and a lot of uncertainty. But I also understand that certainty isn't something I should expect on any topic -- let alone this one. Moreover, career inaction isn't an option I can afford, so I'll likely be barreling down the software dev path very soon.
I'd say marketing is business-critical, and the difference between phone-it-in, good, great, and stellar content is important to bottom lines (depending on industry/product/service). That said, if the general point is that grammar issues on a site will have a lesser negative effect than buggy code that crashes that site, I agree. I'd also agree that unless you're a marketing or content agency, marketing and content may be part of your business but they're not the core of it. In contrast, almost every business in every industry runs on software today...
Still, I don't know how long things like scale, complexity, and strategy will be meaningful hurdles for LLMs and other AI technology (nobody does), but it feels like we're accelerating toward an end point. Regardless, software engineering seems like a good skill to add to the toolbox, and it's good to hear that I may not be too late to the game.
When it comes to refining AI-generated code, do you imagine this being done within organizations by the same number of programmers, or that LLMs could be managed by fewer senior- (or even lower-) level engineers? This question is inspired by my observations in marketing, where the stock of full-time writers appears to be going down. I totally get that LLMs can't create their own prompts, debug every line of code, or approve products, but do you think they'll start allowing orgs to complete product development cycles with fewer engineers?
Great point that coding isn’t an end in itself. In addition to seeming fun/interesting, I'm looking to learn this skill for greater domain range, technical building ability, and professional autonomy. Knowing how to code could eventually help me launch a startup or support an EA-related org. And yeah, earning to give while I ramp makes this path even more attractive. Many great points and thanks for the encouragement!
Very interesting point. I hadn't seen this as super plausible given how AI is starting to be used in copywriting/marketing: 1) copy editors can now give prompts to LLMs and refine from there, and 2) non-writing workers (e.g., marketing coordinators, account managers) can use LLMs to create "good enough" pieces for landing pages, social captions, SEO, etc. This kind of AI integration seems to be eliminating the need for copywriters, content writers, brand writers, etc. But I should acknowledge that a lot of my worries are based on anecdotal evidence. I was the only full-time writer at my previous agency and, while I left of my own accord, it looks like they're going to experiment without the position. I think their plan is to get non-writing account managers proficient with an LLM and contract with a lower-level writer for client edits.
According to the BLS, writers and authors (a very broad category) are expected to grow by 4% over the next 10 years, while editor roles are expected to decline by 5%. I do imagine that copy directors, technical writers, and script writers (at various levels) will be among those spared near-term replacement, but these are very specific niches, and the ability of LLMs to craft slogans, taglines, and scripts is getting quite impressive...
Now, I understand content creation is quite different from software engineering, and perhaps the former positions and tasks don't map well onto the latter. To your point, maybe the transformation in software is more analogous to physical engineering, where a newer professional who knows SOLIDWORKS, Fusion 360, FDM/3D, etc. is going to add value where someone more experienced who only works with legacy programs and traditional manufacturing can't. Does that comparison feel appropriate?
Was this recorded in any way? Would love to see/hear the talk if possible.