Indie filmmaking, US AI Action Plan, and an AI co-host

Pierson Marks (00:07.0)
Great, well, welcome to Creative Flux, episode seven. I'm your host, Pierson Marks, the co-founder and CEO of JellyPod. Unfortunately, today is going to be a different type of episode: my co-host, Balal, isn't going to be joining us. But we'll be doing something a little bit different.

We're going to be bringing in another co-host, somebody who's very familiar with everything that's going on in the world, and whom you may have heard of before. We'll bring them in and do the podcast like normal, just a little bit differently, in an AI way. So today I wanted to talk about

a few things. There's a lot of stuff that happened. There's the AI Action Plan that the United States released with President Trump, which impacts every industry. There were some other things that happened as well, some of them just today: Runway ML released their Aleph model. We'll dive into that and maybe play with it a little bit. And one of the things I wanted to do today was dig into some of the tools and

provide actual use cases for how I'm using AI on a daily basis to accelerate what we're doing and make it easier to be creative. So let's bring in our AI co-host, who's very familiar: Spruce.

Spruce will be coming in and joining me. But we'll keep this one on the shorter end today. So just hang in there with us and we'll see how it goes.

Pierson Marks (02:34.346)
Here we go. Nice. Okay. Let's just bring this guy down a little bit. Hey Spruce, can you hear me?

Pierson Marks (02:51.502)
Sweet, yeah. And just for the audience's awareness: I think AI is not at that place yet to really have a great conversation between me and you. I know you can do a lot of stuff over there. I don't even know if you're able to search the internet right now. Can you search the internet right now?

Pierson Marks (03:27.47)
Sweet, that's awesome. Yeah, so maybe you're like a co-host and producer all in one, which is pretty interesting. This concept was actually something we considered building JellyPod towards.

For anybody out there that's joining for the first time: I know I introduced myself, I'm the co-founder and CEO of JellyPod. We're an AI podcast studio, and the concept there is that we're giving creators a toolbox to take information that they might already have and create AI podcasts between AI co-hosts, where you as the director

can modify the script and give it sources and information. JellyPod will take the information you provided and actually go out and create a script between your hosts, doing a lot of that actual script writing, the research, the outlining, while still giving you that creative control to make sure the podcast you're creating is in your brand voice and is easily distributable.

So that's what we do, and I've heard from a lot of people that this concept of AI-and-human podcasting could be pretty interesting. I mean, I'm sitting here and, Spruce, you're right there, and you're listening to everything I'm saying. And it might actually be

listenable, like, possible for us to have a conversation, where you can call tools, go out onto the internet, bring in information, or even act as an AI producer that can pull things up from the internet while me and my actual co-host, Balal, are here drawing diagrams, pulling up web pages, show notes, all of the above.

Pierson Marks (05:35.138)
What do you think? I'm just curious what your thoughts are here. And don't just be agreeable; I want your real thoughts. I'm curious.

Pierson Marks (06:07.33)
Yeah, no, I agree. And I really hope this screen recording is capturing audio; it would suck if it isn't, but I hope it is. I'll cut this out. But yeah, it's a pretty interesting thing. So the first thing I wanted to talk about, honestly, was a little bit about

the United States' AI Action Plan. Spruce, if you could just go out onto the internet and pull in some information about what happened, what went down. I know Jensen spoke, I know President Trump spoke, the Vice President, JD Vance, spoke. We had some people from

the OMB and a bunch of different government agencies, all talking about how the American government, the federal government, is thinking about dominating the AI race. I listened to some of it; I haven't listened to it all yet. And I know this is a generative media podcast, but I think at a global level,

the concepts, and at least from my perspective the light being shed on the importance of this nascent technology, to harness its potential benefits and not be fearful of the negatives, are extremely important. It's a very American value that's allowed us to be prosperous over the last 250 years: to embrace change

and to align incentives that encourage people to take risks and build things. That was my general takeaway from the two or so hours I've listened to so far. It was very inspiring and a great kind of

Pierson Marks (08:14.306)
thing for this industry. They're aware, and I think that's the coolest part.

Pierson Marks (08:27.662)
And I can hear Spruce's search engine going right now, which is cool.

Pierson Marks (09:32.844)
Right, no, totally. There was an interesting quote from Jensen. Can you pull that one up?

Pierson Marks (10:17.517)
Right.

And it's kind of interesting. I think Jensen has to play those cards. I know that Nvidia has been trying to break into the Chinese market for a while, and they just got approval to do so. So I want to take those comments with a grain of salt. I think it's important for whoever is the president, whoever is in control of Congress, to, you know, play to them. They have to. But yeah, it was interesting. I'm not going to dive much deeper here.

This is a generative media podcast, and I really wanted to spend some time on what are some workflows that I use daily that really make it possible to accomplish everything we're accomplishing at JellyPod with a team size of just two. So the first thing that I wanted to talk about is

something that I think is overlooked, and it's overlooked because there's no right or wrong way to go about building workflows. I'm very excited for a time when I can be an orchestrator of a model that has access to tools,

that can augment my decision-making process and my execution flow, so I can delegate, honestly, to one model

Pierson Marks (11:59.726)
that can go out, spawn subagents, and do things. I'm talking about something with so much power: to go onto the internet, to generate images, to

write code, to understand my inbox, to have control over my computer. It's super, super exciting. I think people don't know about that sort of interface yet, at least people that aren't in the tech industry, and don't use

the super powerful tools that exist today. One of those things for me is Raycast. I know I've mentioned them before; I like it a lot. It's essentially a Spotlight replacement: if you're on a Mac, you're familiar with Spotlight, the global search. Raycast replaces that and gives you a lot more power at your fingertips to control your computer and do things that either are difficult to do or that you didn't know

your computer could do, like picking a color on the screen or generating placeholder text for a blog article. But they recently released their AI chat, Raycast AI, which gives you access to all these different models: OpenAI's models (o3, GPT-4.1, GPT-4o), Anthropic's are in there as well, you have Grok, you have Ollama, so you can run a lot of those models

locally, which is super cool, because there's a very real future in the near term where a lot of models should be running on device. I believe we're going to see hybrid setups: models that are small enough to run on device locally, models that can just, you know,

Pierson Marks (14:14.544)
interact with your computer, great tool-calling models. They don't need to know that much; they just need to be able to say: hey, I'm not a smart model, but I can use these tools, take the result, and give it back to the user. I think you'll see those models running locally, while the big one- or two-trillion-parameter models are still going to be running in the cloud. You're going to have this kind of hybrid workflow where the complicated tasks that can't

be run on your computer go to the cloud, and as much as possible gets offloaded and distributed to the user, using the user's energy rather than the cloud's. It saves on energy costs and distributes the load. That's pretty much what Raycast is doing: Raycast AI has chat, I can generate images, I can use all the cool stuff from all these different model providers, locally.
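To make that hybrid idea concrete, here's a minimal routing sketch. Everything in it is hypothetical: the `Task` shape, the two runner functions, and the routing rule are illustrative stand-ins, not any real Raycast or Ollama API.

```python
# Sketch of local-vs-cloud routing: a small on-device model handles cheap
# tool-calling work; anything heavier is escalated to a large cloud model.
from dataclasses import dataclass

@dataclass
class Task:
    prompt: str
    needs_deep_reasoning: bool  # in practice a classifier decides this, not a flag

def run_local(task: Task) -> str:
    # stand-in for a small on-device tool-calling model (e.g. served via Ollama)
    return f"local-result({task.prompt})"

def run_cloud(task: Task) -> str:
    # stand-in for a large frontier model behind a cloud API
    return f"cloud-result({task.prompt})"

def route(task: Task) -> str:
    """Offload to the device whenever possible; escalate otherwise."""
    if task.needs_deep_reasoning:
        return run_cloud(task)
    return run_local(task)
```

The point of the sketch is just the split: simple tool calls stay on the user's hardware, and only the complicated tasks spend cloud compute.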

That's something I find really cool, and I'd highly recommend anyone check it out. One of the workflows that I think is super powerful here is in the generative media space. We have a blog, the JellyPod blog, and I mentioned this in episode four or so, where

I write product updates every time we release a new feature for JellyPod. Behind each of those product updates there's a PR, a pull request, that merged new code. You can think of a pull request as a new unit of code that got pushed

to a service. So I build a new feature, like the deep search we just released in JellyPod this week. I created a pull request that got merged, and that new feature is live, right? And in that pull request, I have to put a description, some information about: what did this do? Why did I do it? What's the impact to the user? How did I test this code? And Raycast actually has access to my GitHub.

Pierson Marks (16:37.614)
But Raycast also has access to our content management system, which has all of our previous product announcements. It also has access to tools and

just good writing models. So what I've found myself doing is going into Raycast and asking it to generate a product update, one that is grounded in the pull request description. I wrote that pull request for myself and for my teammates who want to understand the code I wrote and why I built this thing, and that is amazing context for a language model

that is writing a blog post about a new feature announcement. I've already written this stuff over there, so

an LLM can help repurpose that content, understand how I wrote my previous blog posts because it has access to them all, and then actually write the new post and save it to our content management system as a draft. That's a really powerful workflow, and it's powered through MCP, the Model Context Protocol, which you can think of as

a higher-level abstraction over APIs, tuned so that LLMs can call them easily. That's just a powerful workflow. And it sits in the generative AI space, the generative media space, because it's content, it's written content. And if you're a founder, or a director of marketing,

Pierson Marks (18:22.87)
you should not be writing stuff from scratch anymore. You should have, essentially, an LLM in the center connecting data sources. You have data coming in

and you have content coming out, and you're the orchestrator, the editor, and the reviewer. So there are three spokes to that content pipeline. You have your data in: whatever the context is, from your team or whatever you're writing about. You have the data out: your content management system, where the content lives. And then you have you,

with the LLMs in the middle, you know, pulling in stuff, writing things, asking you for feedback, and you providing feedback. That's the way I think this next generation of operations will work. This is specifically for writing, but it's a similar model for all

operations; all AI applications will be this concept of: you have data in, you have tools, you have outputs, and you have human feedback throughout the process. So, if you're listening to this, I mean, it's hard to know what you don't know.
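The product-update pipeline described above (data in from GitHub and the CMS, an LLM in the middle, a draft out, with a human as reviewer) could be sketched roughly like this. Every function here is a hypothetical stand-in: in practice the fetching and saving would go through MCP servers for GitHub and the CMS, and `generate` would be a real model call.

```python
# Sketch of the "data in -> LLM -> content out" pipeline with human review.
# All functions are illustrative stand-ins, not real APIs.

def fetch_pr_description(pr_id: int) -> str:
    # data in: the merged pull request's description (via a GitHub MCP server)
    return f"PR #{pr_id}: added deep search; impact and testing notes..."

def fetch_past_posts() -> list[str]:
    # data in: previous announcements, so the model can match the brand voice
    return ["Post: introducing JellyPod...", "Post: custom voices..."]

def generate(context: str) -> str:
    # stand-in for an LLM call grounded in the gathered context
    return f"DRAFT grounded in -> {context[:30]}..."

def save_draft_to_cms(draft: str) -> str:
    # data out: lands in the CMS as a draft, never auto-published
    return f"saved:{draft}"

def product_update_pipeline(pr_id: int, approve) -> str:
    """Gather context, generate, let the human approve, then save."""
    context = fetch_pr_description(pr_id) + "\n" + "\n".join(fetch_past_posts())
    draft = generate(context)
    if not approve(draft):  # you are the editor and reviewer
        raise RuntimeError("draft rejected; revise and regenerate")
    return save_draft_to_cms(draft)
```

The human approval step in the middle is the whole point: the model drafts, the person keeps creative control.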

And I think that's the challenge, especially when the world's moving so fast. You don't even know what exists. I barely knew what happened this week in the generative space; there were a bunch of new things, there was the AI summit. It's challenging to stay up to date, especially if you're not living in this world. So it makes sense why people either feel overwhelmed or don't even know where to start,

Pierson Marks (20:23.086)
unless they're really self-motivated, which, if you're listening to this, I mean, this is a good place. And I'm always open; just, you know, connect on LinkedIn or on X, I write a lot there. So, that was that. I mean, Spruce, any thoughts?

Pierson Marks (20:57.378)
Yes, Spruce, thoughts?

Pierson Marks (21:20.43)
Hold on, Spruce left. Pulling him back in.

Pierson Marks (21:27.139)
Nice.

Pierson Marks (21:32.664)
Spruce, any thoughts?

Pierson Marks (21:39.747)
No, don't pull up the information. You don't need to pull up information. It's just, you know, a lot of stuff to think about, and I know I just went on a long monologue, but yeah. So, okay. What's next? Next up on the agenda today was...

Pierson Marks (22:07.694)
Okay, Runway Aleph. We're gonna say bye to Spruce for now. We'll end that and stop sharing my screen here.

Pierson Marks (22:24.014)
Actually, so, Runway Aleph.

We've talked about Runway ML's Gen-4 before, and we've also talked about their Act-Two model, like two weeks ago, which is kind of a video-to-video model. And, you know, we've talked about how there are text-to-video models, where you put a prompt in and you get a video out. We've talked about image-to-video models, where you put an image in and get a video out, which is really cool for consistency.

You know, a picture is worth a thousand words, and by providing an image as the first frame to a video model, like Veo 3, which now supports image-to-video, you can ensure there's actually some level of consistency across generations. So if you're generating a squirrel on the street looking at a tree,

that image, the squirrel and how it looks, can stay consistent over time, with much less of a chance of it becoming a completely different shade of brown, a different tree, a different background. That's super powerful. So, Runway ML, they just released their Aleph

model, which honestly looks like one of the coolest things I've seen in the video model space, just because, at least from what they're saying,

Pierson Marks (24:19.19)
it's really good at understanding instructions and at consistency. Runway says Aleph is a way to edit, transform, and generate video. On the surface, that's the thing they've been saying for a long time. But in the demo, and it's about a one-minute demo, they showcase such great consistency. Obviously, it's a demo; it's a video that's been edited.

But you can take a piece of footage and ask it to change the camera angle. If you have a shot of a man playing the saxophone and the camera is looking at him from the stage, or from behind the scenes, you can ask: hey, put the camera as if it's right in front of his face, or

just make the camera pan from left to right. And I think that's super cool. I'm really excited about this whole concept that anybody, from anywhere, will no longer have to go raise a lot of money to make a movie. Indie filmmaking is going to be such an incredible

new endeavor that I strongly believe is going to be far more feasible. Think about how the streaming industry has disrupted and changed Hollywood.

You used to have a bunch of filmmakers, and they all got condensed into the Disneys of the world, and the Universals, and the Warner Brothers, and Paramount. You had to have a massive budget. You had to have CGI, you had to pay actors, you had to do all this stuff.

Pierson Marks (26:25.838)
And it was cost-prohibitive for anybody to actually make a feature film, a long feature film, without spending millions of dollars. It was just impossible, because of the time required, the people required. And what you started to see is, fans would raise some money. A great example: there's this Star Wars short about

Darth Maul and his last challenge before becoming a Sith master. It was a 20-minute film, great production.

They had like 15 different actors in it, and it was done really well. You could have told me Disney made it as a small scene and I would have believed you. I don't know how long it took them, but it takes time and money to do this stuff. And they were passionate enough, and lucky enough, to be able to dedicate the time and the resources necessary to make that film.

But if you had one person, or three people in a bedroom or a basement, that wanted to create that, now they can. They can use these tools to generate video, to generate incredible content, and they no longer need the perfect set for what they envision. You don't need

knowledge of how to do the lighting and the CGI and the voices. You have ElevenLabs, which does voices in 70 different languages; you have video models that sometimes have audio built in; you have dubbing. Now you can have an amazing

Pierson Marks (28:35.966)
movie or show in another language brought to English, or vice versa. And the way the streaming industry works now, like Netflix: a lot of the shows on Netflix, they're

buying the rights to them and just distributing them on Netflix. People don't go to the movies anymore; you don't have this massive box office. Now it's coming back a little bit, but it'll never be the same as it was 10 years ago. So it's actually feasible, if you wanted to create an indie studio without having all the resources a big

Hollywood studio has: you could do it, and you could compete, you could be on the same stage. It becomes purely about the ideas you have, your storytelling ability, and no longer about the amount of capital you can raise to make sure the business pans out. That's what, at a high level, I'm so excited about with generative media: it lowers that barrier to entry for people who are good storytellers, the people that have interesting things to say, that have ideas,

and it enables those ideas to be shared. It makes it economically feasible to actually pursue a creative endeavor that can reach billions of people around the world, distributed through the internet, built with tools in a short period of time,

and that doesn't cost much. And I know there's always an argument to be made about slop, about whether these tools are actually good for the industry, and you had the writers' strike in Hollywood. But I don't know how anybody can look at this level of progress and not be excited about the potential

Pierson Marks (30:45.582)
for hardworking people that were previously disenfranchised from these pursuits, from being an artist, from being a creative, and that are now able to compete on that same stage as the Steven Spielbergs, the Christopher Nolans,

without having to raise a lot of money and convince people that the thing you want to build is a good idea. You could just do it. And if

it costs you $50,000 in video generation credits and you make a film that gets bought by Netflix for a million bucks, that's an amazing return on investment for you as an individual, even if you spent a year doing it. That's possible, and I think that's overall the most exciting thing

about this whole space. So, I know we talked a little bit about Runway Aleph. There's a tool, Flora Fauna, that I would highly recommend you all check out. I was playing around with it last weekend, trying to make Greco-futuristic

architecture. I was trying to emulate a person I follow on Twitter who's going viral with this architectural concept. So I was playing around with that, and I was going to dive in a little bit there today, but honestly, I don't know if we have much more time. So: Flora Fauna, that's really cool, check it out. Qwen3 Coder, I know this is not a coding or technical podcast, but that also came out this week.

Pierson Marks (32:37.132)
We had the AI Action Plan; we had JSON prompts for Veo 3, which are a way to better specify all the creative controls that Veo 3 needs to produce something of high quality. You can just pass in a huge JSON blob, which is pretty much a specific format of inputs

that is more computer-parsable.
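For a sense of what one of those JSON prompts can look like, here's an illustrative example built in Python. There's no single official schema; the field names below are the kind of structure people share online and are assumptions, not a documented Veo 3 spec.

```python
# Illustrative "JSON prompt" for a video model. The structured blob is what
# you'd paste in as the text prompt; the keys are assumed, not an official schema.
import json

prompt = {
    "description": "a squirrel on a quiet street, staring up at an oak tree",
    "style": "cinematic, shallow depth of field, golden hour",
    "camera": {"shot": "medium close-up", "movement": "slow pan left to right"},
    "lighting": "warm late-afternoon sun, long shadows",
    "audio": "light wind, distant birdsong",
    "duration_seconds": 8,
}

print(json.dumps(prompt, indent=2))
```

Splitting the creative controls into named fields like this is what makes the prompt "computer-parsable": each knob (camera, lighting, audio) is stated explicitly instead of being buried in one run-on sentence.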

And that's been seeing a lot of success, really great generations. So if you're looking for something to do this weekend and just want to see some cool stuff people are generating with Veo 3, look up "Veo 3 JSON prompts." You could probably even try it yourself; I'm going to spend some time this weekend messing around with it. So, I know we talked about a lot, I went on a little riff, we had an AI co-host here. Hopefully this is still interesting. It was kind of difficult to think about how I wanted to do this podcast

today. I wasn't able to get an actual human guest on this week; it was too short notice to get somebody I thought would be interesting. But for next week, next Friday, I'll figure out something else, because it can't just be me. We want to keep this up. Creative Flux, episode seven. Hopefully it's

interesting and still valuable. But, you know, we'll chat next week. See you all later.
