🔴LIVE – My AI Coding Workflow has 10x’d Again with Arch… — Transcript

Live demo of Cole Medin's AI coding workflow using Archon, showcasing 10x+ productivity improvements and real-world development at scale.

Key Takeaways

  • Archon significantly accelerates AI-assisted coding workflows, enabling rapid development and multitasking.
  • Using issues as inputs allows structured and scalable coding agent workflows.
  • A workflow marketplace will facilitate sharing and installing custom Archon workflows easily.
  • Combining AI coding with human validation ensures deterministic and reliable software development.
  • Live demonstrations on real projects provide valuable insights into practical AI coding applications.

Summary

  • Cole Medin demonstrates his AI coding workflow live, highlighting how Archon has increased his productivity by 10x to 100x.
  • The stream covers both brownfield (existing codebase) and greenfield (new features) development using Archon.
  • Archon enables working on multiple tasks in parallel by using issues as inputs to the coding agent.
  • A key focus is on building a workflow marketplace for Archon, similar to skills.sh, to share and host workflows.
  • The workflow marketplace concept involves bundling workflows as assets hosted in repositories, installable like NPM packages.
  • Archon emphasizes deterministic AI coding by mixing human validation and AI assistance to maintain quality.
  • The livestream includes real-time coding on Archon itself, showcasing its use on a complex codebase.
  • Cole shares insights into his planning process using Claude for session preparation and workflow orchestration.
  • The stream aims to provide a realistic view of AI-assisted coding beyond demos, focusing on practical, scalable development.
  • Community interaction and feedback are integrated, with viewers sharing their experiences using Archon’s PR review features.

Full Transcript

00:00
Speaker A
Welcome everyone to the livestream where I show you my up-to-date AI coding workflow and how Archon has done yet another 10x over my AI coding process.
00:26
Speaker A
So I would say that my classic development before AI coding assistance was a thing, that was a 1x.
00:48
Speaker A
And then starting to introduce tools like Claude Code brought me to a 10x.
01:15
Speaker A
And now Archon, it honestly feels like a 100x.
01:45
Speaker A
And maybe that's a little bit of exaggeration just because it's the clean 1 to 10 to 1, but it's pretty crazy how fast I'm able to build things and work on different things in parallel thanks to Archon.
02:08
Speaker A
So that's what I want to show you guys today.
02:39
Speaker A
I'll give you really like an inside scoop.
03:05
Speaker A
Like I'm not just gonna be working on some kind of demo that I've set up for today.
03:35
Speaker A
I'm going to be using Archon to, well, it's a little meta, but I'm gonna be using Archon to work on Archon because it is just the best example of a really complicated code base that I'm super wrapped up in.
03:59
Speaker A
So I can show you what it looks like to do real AI coding at scale with Archon, not just some dinky app that I built.
04:25
Speaker A
So I've got a pretty comprehensive plan for today.
04:49
Speaker A
Let me go ahead and pop this up here.
05:13
Speaker A
So I have a file in my Obsidian Vault where I've pretty much just had a big session with Claude planning on everything that I do in the stream today.
05:43
Speaker A
I want to cover and showcase really like different parts of my AI coding workflow with Archon.
06:19
Speaker A
I want to start by showing what it looks like to do brownfield development, because I don't think that's really covered enough.
06:50
Speaker A
I want to show how I can use issues as the input to my coding agent to work on different things.
07:18
Speaker A
And then I want to show Archon for more greenfield development.
07:43
Speaker A
I'll still be working on an existing code base, but we'll be building out a brand new feature.
07:59
Speaker A
And I'll actually show you a little bit of a teaser for that.
08:06
Speaker A
Just honestly want to thank you for giving us Archon. Literally don't ship any PR without my custom PR reviews from the built-in ones. Very cool. And yes, you're very, very welcome. And I love that you built your own pull request review
08:19
Speaker A
One thing that I want to work on for Archon, and I'll explain what Archon is and give my elevator pitch in just a couple of minutes here for those of you who don't know what it is.
08:36
Speaker A
single live stream, but I want to do that really quick for those of you who haven't used Archon before. And then even if you are already familiar with the tool, I think it's kind of cool just to see how my elevator pitch is
08:48
Speaker A
But one thing I want to do for Archon is build a workflow marketplace.
08:59
Speaker A
recap of what I'm doing, just so that if you weren't tuning in from the beginning, you still have a chance to catch the whole goal in the live stream every half hour. Anyway, the way that I've been framing Archon recently is
09:15
Speaker A
So a place for people to build their own custom coding agent harnesses with Archon, and then share them with other people.
09:33
Speaker A
LLMs into your system instead of the other way around. And that distinction is really powerful because a lot of what people are trying to do right now is they'll take their entire development process, like planning and implementing and validating, and they'll just shove
09:48
Speaker A
And you can install each of these with just a single command, kind of like an NPM registry or something like a skills registry, like what Vercel has with skills.sh, but for Archon workflows.
10:05
Speaker A
context in the wrong way, they'll forget to do validation. There are certain things that we want to enforce into our system. We want to use the coding agent when we need that reasoning capability, to actually write the code, for example, but we don't
10:17
Speaker A
So this is greenfield sort of in the sense that this is a brand new part of the Archon ecosystem.
10:33
Speaker A
workflows, making AI coding as deterministic as possible. Because there are certain steps where we don't actually need a large language model. Like maybe we do some implementation with Claude Code or Codex or Pi or whatever, and then right after the implementation, we
10:47
Speaker A
We don't have anything like a workflow marketplace yet.
11:00
Speaker A
have human in the loop built in, you can mix providers, you can use Claude for implementation and Codex for validation. There's different parameters to manage how context is passed between different nodes. And so really the goal with Archon, and we're pretty much there
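[Editor's note: as an illustration of the provider mixing and context parameters described here, a node sequence might look something like the sketch below. The key names (`provider`, `context`, `human-in-the-loop`) are assumptions made for the sake of the example, not Archon's documented schema.]

```yaml
# Hypothetical sketch only — key names are illustrative, not Archon's real schema.
nodes:
  - id: implement
    type: coding-agent
    provider: claude          # Claude handles the implementation
    prompt: Implement the fix described in the issue context
  - id: validate
    type: coding-agent
    provider: codex           # a different provider reviews the work
    context: fresh            # fresh session, so no bias carries over from implementation
    prompt: Review the diff against the issue's acceptance criteria
  - id: approve
    type: human-in-the-loop   # pause for manual sign-off before merging
```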
11:15
Speaker A
So what you're looking at right here is really just a POC that I did in a feature branch.
11:28
Speaker A
the things that I'll show you today is invoking in parallel to like really, really scale. So once you have a workflow that you trust and you kind of just have it as this like fire and forget task, like, okay, go handle this issue
11:40
Speaker A
I really am not gonna be leveraging this.
11:53
Speaker A
limits, but if I really wanted to, I could have it try to address 20 issues at the exact same time because Archon will create isolated code bases for each one of them. So all my coding agents aren't stepping on each other's toes as
12:05
Speaker A
I'm gonna try building it from scratch here.
12:19
Speaker A
Maybe you asked this before I just explained it. What's the difference between an Archon workflow and a Claude skill? And so this is like exactly the problem that I'm trying to address here is like, yes, you can have your entire process as a
12:31
Speaker A
So we're gonna be using Archon for a bunch of brownfield, a bunch of greenfield.
12:45
Speaker A
wrong order or just not quite what you wanted. So basically with Archon, I know it might sound a little counterintuitive, we're trying to take the decision away from the coding agent as much as possible. That's the idea of a harness is we have
12:59
Speaker A
And then to finish things off here, there's one other thing.
13:13
Speaker A
the coding agent to do the same validation every single time. When there's something that I need guaranteed, I actually want to take that away from the coding agent. So that's why we have this sort of interleaving here of some nodes that are relying
13:25
Speaker A
Maybe we'll have time for this, I don't really know.
13:40
Speaker A
And so I'll show you what that looks like right now. So I'm gonna pop back over to my Claude Code here. And I'm actually working within my second brain right now. So this is another thing that I've been
13:52
Speaker A
But doing some work on the Dark Factory as well.
14:04
Speaker A
And so this is the diagram here. I'm not gonna like go into all of this, because that's not exactly the point of the live stream right now. But I have a system in place that essentially allows my second brain to
14:17
Speaker A
So there's quite a few live streams that I did within the last couple of weeks on my Dark Factory, and I went kind of quiet on that within the last week and a half just because I've been pretty busy with some other things going on, especially with Archon.
14:29
Speaker A
viral last month, being able to like have an index that then points to other documents to read, like for our case, if we're working on a specific code base. So if I tell my second brain, I want you to use Archon to
14:44
Speaker A
So we'll see if I have time to bring that back today.
14:58
Speaker A
like, here's a specific code base I want to work on, it'll know to read the full file. Like this is the file for my Dark Factory experiment that I've been doing live streams on recently. And so it'll read this file so that it
15:11
Speaker A
Especially because as we are invoking Archon workflows to handle everything else above, we'll probably have some down time.
15:26
Speaker A
specific code base. And then all of the custom workflows that I've built for this specific code base. Because without this, my second brain, it has the Archon skill. So it knows like generally how to use Archon, how to build workflows and run them.
15:40
Speaker A
So I might end up just spending all that time and chatting with you guys, because I always love going back to the chat pretty often when we're waiting for coding agents to finish, but we'll see if we have time for that.
15:53
Speaker A
take inspiration from them to build your own. And so for every single code base, you're gonna have custom workflows, because each code base has its own tech stack and architecture. Maybe you have a bit of a different process between each one of your
16:05
Speaker A
We'll see how long I stream for as well.
16:16
Speaker A
here. I have my workflow preferences, and then even a dispatch history. So this is a really cool part. So I have a sort of like daily log and reflection system built into my second brain so that whenever I invoke an Archon
16:31
Speaker A
Yeah, for everyone who's here right now, before I just get into things, I want to chat with you guys for a little bit.
16:45
Speaker A
sometimes when you say, you know, like use Archon to fix this issue, it might be like, okay, well, which workflow do you want me to use exactly? And so building up that dispatch history over time gets it to the point where it just
16:57
Speaker A
Appreciate all of you being here.
17:02
Speaker A
Let's just go ahead and use that. For this code base, it knows let's use this workflow when Cole wants to handle a GitHub issue, for example. So that's the goal of the system that I have set up here. It's pretty simple overall.
17:16
Speaker A
I know that hopefully there's a little bit more people in this live stream compared to the usual because I talked about this one in my Wednesday video.
17:30
Speaker A
all right, I want to use Archon to handle issues number 1607, 1610, and 1612 on the Archon code base. That is all I have to do.
17:44
Speaker A
But I'm excited for the new cadence that I have for live streams.
17:58
Speaker A
and handle this. And it'll know automatically to load the Archon skill and read the right files so it knows how I like to work on this specific code base. You can see right here that it's pulled from repositories Archon.md. And so
18:15
Speaker A
Doing these three times a week.
18:27
Speaker A
might not see it load the skill, but once it loads the Archon skill that I have, let me just scroll up, I have the Archon skill right here. This gives it full context for how to use the Archon CLI to invoke all these
18:42
Speaker A
And I'm just going to try this out and see what you guys think and what kind of attendance I really get for these streams.
18:58
Speaker A
brain, for example. So I just copied this skill into my second brain code base so that my Claude Code there has instant access to it. So let's see. Yep, looks like it already had the skill loaded earlier in the conversation.
19:14
Speaker A
But I just want to show what it looks like to build at scale, build for real, not just show the perfect little demonstrations and YouTube videos and talk about the newest features of Claude, because everyone's doing that right now, but really show you what real engineering looks like.
19:29
Speaker A
as we're continuing to evolve the workflows for Archon. We created a work tree with a branch for each one of them. And then there's a quick description here to give it context, kind of truncates the prompt here, but it gives that initial context
19:43
Speaker A
And so some of these things that we're going to be working on today might feel a little random, like you don't even necessarily know what this is if you haven't gotten super deep into Archon, but that's okay, because it's not the point to understand the little individual pieces of work I'm doing, but more just to see my high-level process for AI coding and how I'm using Archon, and just generally how I'm incorporating skills and context for my coding agent.
19:58
Speaker A
Because the thing is, I wanna be really clear here. We are not just vibe coding. Right, like when I use Archon to handle a GitHub issue, I didn't show this in the live stream just for the sake of brevity, but generally I'll spend a good amount of time looking
20:17
Speaker A
So that's what I want to show with you guys.
20:34
Speaker A
Okay, so Archon Setup Wizard does not offer Pi as an AI provider option. And so this is just like we added Pi as the third coding agent that's supported by Archon and we just, I mean, this is my bad. I forgot to add
20:47
Speaker A
And I think there's definitely going to be like each individual live stream I do, the topic is more just based on what am I working on right now.
21:01
Speaker A
work on this right away is because I'm very, very confident in the description of this issue. The way that I use Archon most of the time for brownfield development is the input into any Archon workflow is always the context from the issue. So
21:17
Speaker A
I think just being able to get a glimpse into how I work on a project for real means that I can't really set the topic too much ahead of time.
21:30
Speaker A
to Claude Code and have a conversation around the proposed solutions before I just send off the workflow to go and implement the fix. And so once we do implement the fix, that's when we have a very comprehensive workflow here. I'm not sure why
21:46
Speaker A
But I definitely did a lot of preparation for this one to make sure that I have a lot of value for you guys, just showing you different ways that I use Archon for, like I said, like brownfield and greenfield.
22:03
Speaker A
this one looks like. Because this is still a good visualization here. You can see that for this workflow, there are many different nodes that we go through here. So we start by gathering context around the GitHub issue. In my experimental one, it's pretty
22:17
Speaker A
So we'll get into all of that here.
22:31
Speaker A
is a bug, we're going to investigate the issue, or if it is a new feature request, then we're going to plan for it. So we also have conditional gates and routing in Archon workflows, because the way that we handle an issue is going
22:43
Speaker A
Oh, I'm so sorry.
22:48
Speaker A
And then we go into the implementation and then we have a process for self-validation as well. And the powerful thing here, because we are being able, with Archon, we can have different coding agent sessions in a single workflow, we can have the validation
23:02
Speaker A
You guys could hear the voice echo.
23:16
Speaker A
on its own work. But you always want to have a separate session to review because there's just so much bias that builds up in a coding agent session. And so we have a separate agent do the review. And this is where you could,
23:28
Speaker A
Ah, man, I didn't realize I didn't have the stream muted.
23:40
Speaker A
first pass validation. And then we do a super comprehensive PR review, depending on the kind of change that it is. And then we finally just like comment on the pull request for anything that we had to address from the pull request review. So you
23:56
Speaker A
That is my bad.
24:10
Speaker A
just works so well for me for how I like to plan and implement and validate and do code reviews on the pull requests. And so I've just taken like my entire process for AI coding and just bundled it up into this workflow. And
24:23
Speaker A
Well, it's fixed now.
24:38
Speaker A
box if you want, but it's better to like give these into your coding agent and tell it like, okay, load the Archon skill, take a look at this workflow, this one, this one, whatever one is close to what you want to build or
24:50
Speaker A
Okay, shoot.
24:55
Speaker A
Because you probably have your own process for fixing GitHub issues. This is more just inspiration. You can even like take ideas for the specific nodes here, but I want you to build your own because you have a different process. You're using different skills
25:08
Speaker A
That's a bummer.
25:22
Speaker A
actually very, very easy. I know quite a few people that have taken this specific workflow, because I use this one all of the time, and they molded it to use their task management software. I heard of a couple people doing linear, a couple
25:36
Speaker A
All right.
25:49
Speaker A
is why Archon is the open source harness builder. So, okay, let's take a look and see how far we are now. Okay, so we got an issue comment here. I can also go back to Claude Code, and I can tell it to give me a status update. So we can track the
26:09
Speaker A
So I have it muted now, so hopefully it's good.
26:23
Speaker A
coding agent you're using to dispatch Archon workflows can look in, like, just peer into the process. And so that's another really cool thing with Archon is assuming you trust your workflow, it's very fire and forget. I just asked my second brain to
26:41
Speaker A
But shoot.
26:55
Speaker A
ask it to kick off more Archon workflows. My second brain becomes my command center where any piece of coding work that I want to accomplish, I just dispatch to my Archon CLI, right? Like go do this, go do this, go do this. And
27:09
Speaker A
All right.
27:29
Speaker A
Okay, that's a minimal thing. So it's looking good here. We have the three workflows that are successfully running handling an issue. And then once the pull request is created, that's basically the output of each one of these workflows. So the way I
27:45
Speaker A
Will there be a possibility to configure the community skills then through a pre-configured workflow from the Archon skill creator?
27:58
Speaker A
of input and then you have some kind of artifact that is going to be your output. For me, most of the time, that is going to be a pull request. Most of like for any kind of a brownfield development that I'm doing, or
28:12
Speaker A
Um...
28:24
Speaker A
in some feature branch and this is how enterprise development always has been and always will be. You always work in some feature branch, create a pull request so that then you have a time for human review before you merge to your staging branch
28:36
Speaker A
So I'm not sure if you're talking about skills that you build into Archon workflows, but the plan is when you submit a workflow here, it's going to be a bundled asset for everything that the Archon workflow needs.
28:44
Speaker A
It'll create a pull request at the end. And then for brownfield development, the artifact that it outputs, I guess, is really like the entire code base, but you could even like have it, you know, initialize a Git repository and push things there, so
28:58
Speaker A
So it'll be the core YAML file for the Archon workflow.
29:09
Speaker A
So yeah, I hope that sounds good. I'm gonna spend a little bit in the chat with you guys here before I go on to the next thing, and we'll just wait for these workflows to finish here.
29:24
Speaker A
It'll be any scripts that you're running with any of the deterministic nodes.
29:40
Speaker A
review workflow. We have a couple of those as well, like comprehensive PR review. You can take this and mold it to your code base so that it's running the specific commands, like test coverage for your code base, not Archon.
29:58
Speaker A
And then it'll also be any skills that you're incorporating into the workflow.
30:17
Speaker A
skills or commands that you want to incorporate into the workflow, those are just gonna be markdown documents that you package up with the workflow. So for example, this Archon comprehensive PR review workflow, it has a lot of commands that we call out here.
30:34
Speaker A
So it'll be like an entire package that this command would install together.
30:49
Speaker A
commands. And then we have all of these. So if we go to the defaults folder, this one is called Archon sync PR with main. So if I search for this, you can see that this is just a very classic slash command. That's like
31:03
Speaker A
So you would have whatever you host, basically what I'm thinking about doing for the workflows here is you build the bundle for your workflow and you host it in your own repository, kind of like an NPM package.
31:17
Speaker A
of lines of inline prompts in a workflow, this is how you do it in Archon. You have the main YAML which gives the nodes, like this is the structure for the workflow, and then all the prompts can live as separate commands. And you
31:28
Speaker A
You know, like if you have any kind of package marketplace for Python or TypeScript or whatever, like NPM, you host it yourself and then there'd be some kind of registry that you'd make a pull request into and then that would link out to all the assets you have for each individual Archon workflow.
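[Editor's note: to illustrate the bundle-plus-registry idea described here, a registry entry might look roughly like the sketch below. The manifest format and field names are illustrative assumptions; the marketplace schema doesn't exist yet, as the stream makes clear.]

```yaml
# Hypothetical registry entry for a shared Archon workflow bundle.
# Field names are illustrative — there is no real marketplace schema yet.
name: comprehensive-pr-review
repository: https://github.com/your-user/archon-pr-review-workflow
description: Multi-node PR review workflow with deterministic validation steps
assets:
  - workflow.yaml   # the core Archon workflow definition
  - commands/       # slash-command prompts referenced by nodes
  - skills/         # skill.md files the workflow incorporates
  - scripts/        # scripts run by the deterministic nodes
```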
31:43
Speaker A
workflow where I have a skill incorporated. So if I go to .archon/workflows, this is one of the custom workflows that I built for my Dark Factory experiment. Which by the way, if you don't know, my Dark Factory is a code base where I'm
31:54
Speaker A
That's the plan for at least the proof of concept that I'm going to be doing for the workflow marketplace.
32:07
Speaker A
and I search for skills, the way that you add skills into a node in Archon is just one of the parameters that you set for the node.
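[Editor's note: a minimal sketch of a skill attached to a node as a parameter, as just described. The key names are assumptions based on the stream's description, not Archon's actual workflow syntax.]

```yaml
# Hypothetical node definition — key names are illustrative only.
nodes:
  - id: browser-validation
    type: coding-agent
    provider: claude
    skills:
      - agent-browser   # resolved from the skill.md in this code base's skills directory
    prompt: Validate the UI changes in a real browser session
```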
32:17
Speaker A
And I'm excited for this.
32:31
Speaker A
so these can just be the skills that you have in your .claude skills directory in this code base. So I'm saying I want the agent browser skill and it's going to pull that from the skill.md that I have. If I go back here,
32:45
Speaker A
This is a very, very requested feature because right now people, if they want to share workflows, they just have to send a link to their GitHub repo and tell people to clone it and copy over the files or just share a zip or whatever.
32:59
Speaker A
Archon, we have a full software development lifecycle and there are certain nodes where we want to use a skill and other nodes where we don't want to. Like the agent browser skill, this is for browser automation, which is a great skill for
33:16
Speaker A
But now there'll just be a single install command.
33:25
Speaker A
So yeah, skills are pretty context efficient because we have the idea of progressive disclosure.
33:30
Speaker A
Pretty excited for this.
33:43
Speaker A
the task at hand. So like during implementation, why even give it this skill if it's just going to cost tokens and potentially lead to a bad decision of using it when you don't, when we can just hold off on giving it that skill
33:57
Speaker A
So yeah, we'll get to that in a little bit, but I want to start by giving a quick elevator pitch for Archon and then I'll show what it looks like to work on some GitHub issues because I know there's a big backlog of things.
34:14
Speaker A
What is the best practice to integrate Archon with the second brain? Yeah, that's a good question. So I talked about this very briefly, but I would recommend having some kind of index of the repositories that you work on, like what I
34:29
Speaker A
It's crazy how easy it is for people to create issues and PRs now in open source projects.
34:47
Speaker A
like to use Archon specifically with that code base. And this is what you would evolve over time as you are using your second brain with Archon to work on each one of your projects. And again, I covered this in a full workshop yesterday
35:02
Speaker A
It's because of how fast you can move with AI coding assistance and like, yeah, we can keep up decently well.
35:16
Speaker A
okay, here are the code bases, here's how I use Archon to handle different kinds of requests for each one of them. Right.
35:28
Speaker A
I mean, there's a ton of closed issues and merged pull requests already, but holy cow, the list just keeps growing.
35:43
Speaker A
people. So yeah, you definitely can. My second brain, I guess, has two LLM wikis at this point. This is exciting. I've been building something similar to Archon with scripts, execution shells, and governance documents, but this feels more holistic. Well, I appreciate it a
35:59
Speaker A
So we'll knock out a few of these and I'll s...
36:14
Speaker A
probably is Archon, but more just for you. You know what I mean? Archon is kind of like, instead of just being a harness, it is the harness builder. So it equips you with everything you need to dive right into building your own workflows instead
36:30
Speaker A
of having to create the support for work trees in isolation and build out the system for the different nodes and running scripts and then sending that as context into the coding agent. You would be surprised how much goes on under the hood
36:47
Speaker A
when you invoke an Archon workflow. Like we have the whole artifacts directory and there's like structured output and other ways that the nodes can communicate with each other. Obviously the whole like isolation thing is very complicated to handle edge cases like the different
37:01
Speaker A
default branch names you might have for your code base. Like there are so many little things that we've done to handle edge cases and just generally optimize the token usage of Archon and the speed and everything and having like no dependencies so that
37:15
Speaker A
some things can go in parallel while other things are stuck waiting for another thing.
37:19
Speaker A
That's how you intend it. Like this right here, right? We have these parts of the PR review that work in parallel while most of the rest of the workflow is dependent in a linear fashion. Like this node has to complete before this node.
37:33
Speaker A
So yeah, there is a lot that we have built here. We support conditional nodes. We support routing. We support looping. Bash scripts, TypeScript scripts, Python scripts, and then obviously adding more support for different coding agents.
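[Editor's note: to make the conditional routing and deterministic script nodes concrete, here is a rough sketch of what such a workflow might look like. The structure and key names are assumptions for illustration, not Archon's real syntax.]

```yaml
# Hypothetical sketch of a conditional gate and a deterministic script node.
# Structure and key names are illustrative only.
nodes:
  - id: triage
    type: conditional
    condition: issue.label == "bug"
    on_true: investigate-bug    # bugs get investigated first
    on_false: plan-feature      # feature requests get a planning pass
  - id: investigate-bug
    type: coding-agent
    provider: claude
  - id: plan-feature
    type: coding-agent
    provider: codex
  - id: run-tests
    type: script                # deterministic step: no LLM involved
    run: scripts/run_tests.sh
```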
37:49
Speaker A
Like we're trying to make it so that you have no need to build your own hodgepodge system anymore. Like Archon gives you the foundation to then build anything. All right. What's the difference with N8N? Yeah, so actually it's funny you ask that because in
38:10
Speaker A
the readme here, I say, think N8n, but for software development. So there definitely is a connection there. Because when we look at something like this, it kind of looks like an N8n workflow. In fact, we have a very
38:23
Speaker A
similar viewer here. N8n doesn't use React Flow, but it's something similar. I think N8n uses something with Vue as their framework. I mean, anyway, that detail doesn't really matter. But yeah, it looks like N8n. Now, the thing is, or
38:39
Speaker A
the thing that makes this really different from N8n is, you don't really have support for working with coding agents directly in N8N. It's more for just general business and AI automation, which is useful, definitely has its place. There's a lot less N8N content
38:55
Speaker A
these days, but it's still a great platform. But it's not for AI coding. N8N doesn't handle work trees in isolation like Archon does. It doesn't support different providers that you can inject as nodes automatically. It doesn't handle passing context between different coding
39:12
Speaker A
agent sessions and being able to decide, do you continue the context or do you start fresh? It's not designed for AI coding workflows at all.
39:23
Speaker A
There's so many things in Archon that make it where it really is an apples to oranges comparison. So really, really high level, you can think of them as kind of similar because they're workflow builders, but it's workflows for very, very different purposes. I
39:38
Speaker A
hope that makes sense. All right. Do you place files just like they are into your second brain, or do you change them with Docling before putting them in? If you were talking about the system, the repository wiki system I have here,
40:00
Speaker A
I just have my coding agent help me build these markdown documents over time. I'm not using Docling because I'm not doing any kind of text extraction from documents or any kind of RAG or chunking or anything here. I guess I do have a
40:16
Speaker A
bit of a RAG system in my second brain with semantic search, but that is outside of what I'm talking about here with my sort of LLM wiki for my code bases. This is all just plain markdown, no need for Docling or any kind
40:30
Speaker A
of chunking or text extraction library for this. So like when I created my repositories.md with my dispatch rules, like here's how I like to use Archon, and how I like to run workflows and the whole index of code bases, I just initially had a conversation with Claude Code where I had it
40:52
Speaker A
help me build this. So pretty much I just listed out, here are all the repositories I work on on a weekly basis or a monthly basis. Here is the main branch for each of them and what I'm working on for each one of
41:05
Speaker A
them. Giving it that initial context, kind of generally at a high level describing the structure I want for my index here. And then it created this, and kind of built the scaffolding for each one of these repository files, but it's all just markdown,
41:20
Speaker A
not working with anything fancy here. And in fact, for most of your second brain, it should just be markdown documents, because you can keep it really, really simple that way. Is a workflow fixed to a specific working directory? Yes, but you can also go out of the working
41:39
Speaker A
directory if you need to. But whenever you create a workflow execution, right, like you use the Archon CLI or whatever to run a workflow, it's going to create a work tree. So we have an isolated local copy of the code base. And then
41:55
Speaker A
whenever we run the coding agent there, whether it's Claude, Codex, or Pi, the working directory is set to that work tree so that it's in its own isolated environment wherever it's working. Now, coding agents are generally able to look outside of their current
42:11
Speaker A
working directory. Like, if I go back to Claude code here, right now it's working in my second brain repo, but I could say like, go look at, and I could say, see users, colam, open sort, like I could point it to some other
42:25
Speaker A
folder and it can go look there, but by default, unless you prompt it explicitly, generally it's just gonna stay within its current working directory. And if you want to enforce that, you definitely can, but that would just be through like custom hooks that
42:38
Speaker A
you would build. So like one thing that I've done for my second brain is there are certain folders that are off limits to my second brain. So I have a hook that runs, like whenever Claude tries to read or write out to some
42:51
Speaker A
other directory, I run the hook to make sure that it's not one of my blacklisted directories. And then I'll just stop Claude in its tracks and tell it to do something else if it tries to access one of those directories.
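A hook along these lines might look roughly like this in Python. Claude Code hooks receive the tool call as JSON on stdin, and a non-zero exit can block the action; the blacklist paths, the `file_path` field, and the exact wiring here are illustrative assumptions, not the actual configuration from the stream:

```python
import json, os, sys

# directories the agent must never touch (illustrative, not the real list)
BLACKLIST = ["/home/me/secrets", "/home/me/finances"]

def is_blocked(path, blacklist=BLACKLIST):
    """Return True if path resolves to somewhere inside a blacklisted directory."""
    real = os.path.realpath(path)
    return any(
        os.path.commonpath([real, os.path.realpath(b)]) == os.path.realpath(b)
        for b in blacklist
    )

def main():
    # A pre-tool-use hook receives the pending tool call as JSON on stdin;
    # exiting with code 2 blocks the call and surfaces stderr to the model.
    data = json.load(sys.stdin)
    path = data.get("tool_input", {}).get("file_path", "")
    if path and is_blocked(path):
        print("Blocked: that directory is off limits.", file=sys.stderr)
        sys.exit(2)
```

Using `realpath` before comparing means symlinks into a blacklisted directory are caught too, and `commonpath` avoids false positives on sibling directories with a shared prefix.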
43:05
Speaker A
Alright. Have I switched AI providers? How satisfied am I with AI providers so far? Um... I still use Claude Code. I'm a bit confused by the question because I still use Claude Code for pretty much everything. I've been experimenting with Pi and Codex and I'm pretty impressed with them. Claude
43:27
Speaker A
code is definitely not universally the best coding agent anymore. It's a very close race, especially between Claude and Codex, but I still am using Claude primarily.
43:39
Speaker A
And then how do you measure 10x again? Shouldn't everything be done by now given how fast everything is these days? Yeah, so obviously I don't have some hard and fast benchmark for measuring, okay, I am 12 times as
43:53
Speaker A
fast as I was a year ago or 22 times as fast. There's nothing like that. Like I said at the start of the stream, when I say Archon has 10X my AI coding, I genuinely believe that, but also I don't think it's exactly
44:08
Speaker A
10X. Maybe it's eight, maybe it's 12, whatever. But it's more just the point of I'm able to get so much work done now. Not necessarily because everything finishes really quickly. Going back here, we can look and see that we're still running these workflows.
44:24
Speaker A
They're longer running workflows. They're gonna take their time. When I say I'm 10x faster with Archon, it's not that the workflows are blazing through work. It's more that I can run a lot of these at the exact same time. Now that these are
44:39
Speaker A
running, I can go back to my second brain and I can say, okay, great.
44:46
Speaker A
Let's go and do XYZ now too, right? Like I can continue this conversation. I can do so much work in parallel. And then also the really powerful thing with these Archon workflows is yes, they take a while, but also they're fire and forget,
45:01
Speaker A
right? Like I can invoke these five workflows and then go record a YouTube video or I can go do a live stream. Like I actually have some other Archon workflows running on a different machine right now as I'm live streaming here. So I'm
45:12
Speaker A
getting some other work done as I am building with you guys live. And so you could also go take a break, walk your dog, go to bed, whatever it is. Right. So it's faster, not always in the sense of like these
45:24
Speaker A
things blitz through work, which you could build a workflow that runs quicker if you wanted to. But I care more about like, let's get the best output quality possible.
45:33
Speaker A
I can run these things in parallel, fire and forget. So it's still, you know, exponentially increasing the quality and speed of my code output. Right.
45:46
Speaker A
Cool, so while this is still running here, let me check one thing quick. Do a quick time check here. Okay, been going for 45 minutes. So for those of you who are just joining me, what I'm doing for the livestream today, so I promised I would give quick recaps once in
46:07
Speaker A
a while. So if you're just joining the livestream or you've been around for only a little bit, What I'm doing right now is showing you how I use Archon, my open source harness builder, to create workflows that package up my AI coding process
46:21
Speaker A
into something that's very repeatable and reliable that I can run at scale. And this is a peek, a glimpse into what my engineering looks like these days. So currently I'm handling a few GitHub issues with Archon. These are the workflows that are
46:37
Speaker A
running right now. And I'm also gonna be doing some green field development. So I'm actually gonna get into this right now, and then we'll have more time for Q&A in a little bit here. But while this is running, just to show you generally
46:48
Speaker A
how I handle GitHub issues, which GitHub issues are usually the input into most of my brownfield development these days. I'm also gonna show you some more green field development.
46:57
Speaker A
So I'm gonna take a pretty new feature that I want to build into Archon, show you what it looks like to go through a more intricate planning process using a framework that I call my PIV loop. Maybe you've heard of me talk about
47:12
Speaker A
the PIV loop before. It is my most foundational approach for AI coding. You plan out a scope of work with a coding agent. You delegate all of the coding to the agent and then you have it do its own validation and then you
47:27
Speaker A
also do your own code review and manual testing. That is my process. Every single time I'm working on something more complex that really requires me to be a part of the process with the coding agent, I always use the piv loop. And for
47:43
Speaker A
a while before I had Archon built, I taught the piv loop with a set of commands and skills. So here's my command for planning. You go through and you create the markdown plan document with your coding agent, and then you iterate on that,
47:57
Speaker A
and then you send it into the coding agent for implementation. There's a separate command that I had for implementation. But we can take this entire process and we can package it up as a single piv loop workflow. And I don't have the full,
48:11
Speaker A
full, full piv loop in an Archon workflow yet. That's actually something I'm going to be building in a Dynamis workshop in the next couple of weeks here. but I do have a simplified version of it, which also just for the sake of speed
48:25
Speaker A
here for our livestream, it's good that it's a simple version because I wanna go through this pretty quickly with you guys. But I have this Archon PIV loop workflow. This is one of the defaults that ships, so you could use this yourself right now
48:37
Speaker A
if you want or adapt it to your own harness like I was talking about earlier. We're gonna use this to build out something pretty new here in the Archon codebase. So it's greenfield development in the sense that I'm still working on an
48:51
Speaker A
existing codebase, but unlike these GitHub issues where I'm just working on minimal bug fixes or features, I'm gonna be creating a brand new part of the Archon ecosystem, because I wanna build this kind of workflow marketplace, a place for people to share
49:09
Speaker A
and install other people's workflows. with a single command. And so I have a quick proof of concept that I built out actually in my last live stream. And so I'm basically just showing this as a demo of what I want to build,
49:25
Speaker A
but I want to build it from scratch here, because there are actually quite a few different design decisions that I've decided on from when I built this first version.
49:32
Speaker A
So I have this just stashed away in some feature branch. I'm gonna show you guys going through the PIV loop to build out the initial version of this pretty much from scratch. And so let me go back to my conversation
49:44
Speaker A
here with Claude. I'm gonna start a new one actually. And let's see, let's scroll up here. Because I have a prompt pre-crafted, just so you guys don't have to watch paint dry as I am getting started with the work here. So let me go ahead and find this. Okay.
50:09
Speaker A
So I'm gonna have it read this file. So I didn't do too much planning before the live stream. Like, don't worry, I'm still showing you the full process here. But I did a little bit of like planning out what I wanna do
50:21
Speaker A
for the live stream today. So I'm gonna say, read this and let me know what, or I should say, and summarize what we are building for the workflow marketplace. So whenever I use, the whole PIV loop workflow, I want to start by loading a little bit of context.
50:42
Speaker A
Since I don't have a GitHub issue or something to reference here, I want to curate a little bit of context around what exactly I want to build. Then I will invoke the PIV loop workflow. Okay, so core idea, a
50:57
Speaker A
metadata-only registry where community members keep their workflow YAMLs in their own public GitHub repo.
51:03
Speaker A
So it's like the NPM packages, that's the example that I used earlier. And then we submit a pull request to the Archon doc site, adding one entry to a typed registry file. So I'm keeping this really, really simple. You share the workflow by
51:17
Speaker A
hosting it in your own repo so you can manage the versioning and evolving your own workflow. But then you'll create a pull request so that we have some kind of registry that lives in the Archon codebase that points to all of the workflows.
51:29
Speaker A
And then we'll just have to add a banner to the page here, just giving a warning that we haven't necessarily vetted all of the workflows here. It's definitely something that would be pretty impossible to maintain if we actually had a vetting process,
51:44
Speaker A
especially because then people would have to create a pull request just to update the version of a workflow. Okay, so we have two phases. What are the two phases? There are two phases that I have planned for the livestream today. I wanna start by building
52:06
Speaker A
the actual marketplace. And then the second phase is I want to have an Archon workflow. This is cool. I'm gonna be building an Archon workflow that basically does an automated review whenever someone submits a pull request to add a new workflow to the marketplace.
52:30
Speaker A
So we have the, yeah, I'm referring to number one. I'm talking about the two parallel PIV dispatches. So yeah, like I said, I've done a little bit of planning up front, but you can imagine that like this is work of coming to me
52:44
Speaker A
from a product manager. Generally as a developer or even as an entrepreneur, you're gonna have some context from someone else or the internet or whatever for like here's the thing that I need to build. And so I have a bit of context already,
52:58
Speaker A
but now I'm going to go through a couple of piv loops in order to plan more specifically how we're gonna implement this in the code and then get into the implementation. So we're going to, let's start with the first, or here, I'm gonna actually use speech to text here, because I'm sick
53:16
Speaker A
of typing. I'm going to use my AquaVoice here, my speech to text tool, and I'm gonna say, let's start with the first piv loop for the Marketplace V0. So I want you to load the Archon skill, and we'll use Archon, the piv
53:31
Speaker A
loop workflow, to build this out in a feature branch. So I could do both at the exact same time, but generally for these workflows that have more human in the loop and I really want to be a part of
53:46
Speaker A
the process, I find it to be pretty mentally draining if I have many different coding agents all asking for my review and attention between the different runs. And so at least to keep things simple for the stream, I'll just have it be very
54:01
Speaker A
linear right now where I'll just work on a single PIV loop to start. But you can see that it's set up here where it is offering to do parallel PIV loops if I wanted. It just can be kind of
54:16
Speaker A
tough to juggle different plans in your mind as you are reviewing things before you send it off to implementation. But it's definitely possible. So the PIV loop is dispatched.
54:27
Speaker A
It's going to do exploration over the code base to get a general idea of how are we actually going to implement this Marketplace v0. And then it'll come back with questions for us before it writes the full plan. And so we'll see that
54:40
Speaker A
going back to the UI here, we have four workflows running now. We have the GitHub issue fixes. And by the way, these are only running for a long time because I've designed it that way. Like we can make this a lot shorter if
54:52
Speaker A
we wanted to, but I want these to be very comprehensive, especially with some things that I'm experimenting with. And then now we also have the Archon PIV loop. So this full process that's running right now, starting with the exploration at the top. Pretty
55:06
Speaker A
cool. And so we'll wait for this to run and come back with questions. And I can also say, give me a status update to see where we're at with the GitHub issue fix workflows. All right, so let's see what it says here. So it
55:26
Speaker A
reads the logs for each one of the workflows. And then, apparently Claude is thinking for a long time for some reason, but it'll output the summary in a sec here.
55:43
Speaker A
I'll go back to the chat while we wait for this. What has been wild for me is my main Claude session has apologized to me for missing bugs that were found with the Archon comprehensive PR review. Well, yeah, I mean, that's how coding agents are. They're very
56:02
Speaker A
sycophantic, right? So if you say like, hey, you did this thing wrong, they'd be like, oh, sorry, sorry, sorry, let me go and fix that or whatever. But that's exactly why it's important to have a separate coding agent session to do
56:15
Speaker A
validation. Because the coding agent, when it checks its own work, it is always, always going to miss things. All right.
56:28
Speaker A
So here we go. Massive progress and the recursion bug is biting hard. Okay, I'm not actually sure what this is referring to. So maybe we'll dig into this for a sec here. So the Pi provider, let's see, Codex rejected Opus. I'm not sure why it's using Codex. That's kind of
56:46
Speaker A
weird. Anyway, in the implement step, Opus got routed to Codex, which rejected it. Oh no. I think I might have been messing with some things in my local environment that made it so I was trying to go to Codex instead of Opus. That might
57:12
Speaker A
be my bad. I'm trying to figure out, once the PR merges and the YAML gets provider claude added, redispatching will produce the real fix. Okay.
57:27
Speaker A
Interesting. Yeah, so, I mean, I was messing with some things in my local environment, so I kind of shot myself in the foot here, but at least we actually do have a pull request created. So let's take a look at this. I'm going
57:38
Speaker A
to go over to my browser here, open up the PR. So this one... That's not the right pull request. Okay, here, I'm actually gonna tell it to do something different. Okay, I want you to cancel these Archon workflows, and then I want
57:56
Speaker A
you to change it so that it doesn't use Codex, fix it so it actually uses Claude, and then I want you to rerun the workflows. So I think that's actually a problem of the workflow that I used, because I was, like I said,
58:10
Speaker A
I'm using an experimental GitHub issue fix workflow. I probably shouldn't pick the experimental one for the stream, but if we go into, sorry, I have to go to the Archon repo. Let me go over to Archon. And then I will
58:23
Speaker A
go to Archon workflows, defaults, or this one: archon, fix, github, issue, experimental. Where was it using? Oh, I guess it already adjusted it, because the provider's Claude now. So we should be good now. But it was trying to use Codex earlier. So let's go. Okay.
58:49
Speaker A
Alright, we'll let it run again. But anyway, we have the piv loop running in the other conversation here. So, the explore phase is done and it's waiting on six questions. One important finding. The roadmap doesn't actually exist. The
59:03
Speaker A
marketplace will be the first custom Astro page. So that's only because it's in a different branch. But okay, so I'm gonna go through each one of its questions here.
59:12
Speaker A
So, for the CLI namespace, should it be Archon Workflow Search or Archon Workflow Install?
59:20
Speaker A
or a separate marketplace search / marketplace install? Let's do Archon workflow install. Like I'd want the command to be based on the other POC where it's like Archon workflow install and then the ID of the workflow. Marketplace page styling: wrap a custom page in the Starlight page component for consistent chrome, or standalone
59:41
Speaker A
pages with their own styling. So actually look at the dev branch for Archon, because I do have the roadmap already implemented there, so I want you to find that and just mimic that. Then for number three, just go with
60:02
Speaker A
whatever you recommend for that one. I'll trust your judgment on that. And then for number four, should the PIV explore via GitHub API to pick real workflows or do you want to specify the ones now? Yeah, for the seed entries, let's just pick
60:17
Speaker A
Actually, I suppose I can pick the Archon workflows for the seed entries quick. Let's do the piv loop code review, GitHub issue fix, and let's see, I guess I need one more.
60:35
Speaker A
So these are just the workflows that are going to show up on the initial version of the site when I build it, kind of like what we have right here. For the other one, let's do the ralph loop one. That'd be cool to
60:45
Speaker A
have the Ralph loop. Those can be the ones that we seed. Then for the marketplace.json, flat array of entries or wrapped? Let's just do a flat array of entries. Then for the governance doc, good question.
61:23
Speaker A
I think you're overcomplicating things. Let's just add a section to contributing.md for contributing by creating Archon workflows. And then the marketplace trusted flag, I don't think we actually need that. All right. Yeah, I just want to simplify things.
61:41
Speaker A
Okay, cool. So now what's going to happen is I've given all this feedback, right?
61:46
Speaker A
Like this is the human in the loop in the Archon workflow. The Archon workflow paused, asked me these questions, and I gave the answers here. And now you can see that it's going to run the approve command on the Archon workflow. So
61:59
Speaker A
it approved and then it passed in all my answers. And now the Archon workflow continues to the next step. And so I actually probably should have showed it here.
62:07
Speaker A
But the piv loop, the Archon piv loop workflow was in a pause state for a little bit. Like if I had shown that in the web UI, it would have showed like it's paused, waiting for my input. Obviously now it's running again because
62:18
Speaker A
it ran the approve. And then let's see what happened here. Okay. Still trying to, I probably shouldn't have used experimental workflow, but it's all good. Because you guys got the general idea still of how I use Archon to handle GitHub issues. Okay.
62:44
Speaker A
So now the PIV loop is either going to do another explore round if it needs more clarity, or move to the plan creation. And so really it's like, the coding agent gets to reason now. We have this built into the PIV loop workflow.
62:56
Speaker A
Should I keep exploring or am I confident enough and the user has answered my questions in a way where I know that I can go to create the structured plan. That actual markdown document that outlines everything, all the context we need to go
63:10
Speaker A
into the actual implementation. So if I go into the PIV loop, we can see, well actually let me go to workflows. I think my problem is I don't think I'm in the dev branch for Archon. I think that's why I'm running into the
63:22
Speaker A
issue with the other workflows. But let me go into the PIV loop here and just show you what this looks like. So let me click in. So like this, that's the code review, hold on. That's our implement and this
63:38
Speaker A
is our refine plan. This is our create plan. So if it goes through the exploration phase and it figures like, okay, we're good to go now, and then it's gonna go onto this prompt here. This is the structured plan that it's going to
63:52
Speaker A
create after our initial back and forth with the exploration. So basically, I want the plan document to have the exact same format every single time. So it makes it easier for me to review and it makes my whole process more reliable
64:07
Speaker A
and repeatable. And so just in simple markdown here, I'm describing like, here's the exact structure that I want for the plan. The summary, the mission, the success criteria, what's in scope and out of scope, the code-based context, like here are the files that
64:20
Speaker A
we have to edit and create to implement this new feature. That's what it's going to create here once it gets to that stage in the workflow.
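Based on the sections described here, a plan template might look roughly like this. The headings are paraphrased from the description above, not copied from the actual Archon prompt:

```markdown
# Plan: <feature name>

## Summary / Mission
One-paragraph statement of what this change accomplishes and why.

## Success Criteria
- [ ] Observable outcomes that prove the feature works

## Scope
**In scope:** ...
**Out of scope:** ...

## Codebase Context
| File | Action (edit/create) | Notes |
|------|----------------------|-------|
| path/to/file.ts | edit | why it must change |
```

Keeping the format identical across plans is the point: the review step becomes a scan of familiar sections rather than a fresh read every time.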
64:30
Speaker A
Okay, the plan looks solid. Okay, give me the full path to the plan. So it looks like it actually created the plan.
64:41
Speaker A
So it paused again to send that to me so that I can review it.
64:46
Speaker A
Right. So like this is the second time that it's been a there's been a pause here. So let me actually open this. So I'm going to go to my browser here or my file explorer.
65:05
Speaker A
I hate Windows. My file explorer just froze. I hate Windows so much right now.
65:11
Speaker A
Come on. Where, where are you? There we go. Artifacts. Wait, that's not the right path. No, that isn't the right path.
65:31
Speaker A
I need the exact path to the plan. So usually it gives me the path. I'm not sure why it didn't. But yeah, I need it to give me the path so I can actually review it. And then once I review the plan, I can either give it feedback or I can say ready.
65:49
Speaker A
Or wait, hold on. Oh, I'm sorry. That's my bad. It's still in the explore phase. It came back for more input from me. And then I can say ready and then it'll move and write out the plan. So, okay, that's my bad. I
66:02
Speaker A
just read that wrong. Which is good, because I want it to stop for my approval with this stuff before it goes and just creates the full structured plan. So one question before I say it ready, do you want the marketplace URL to be
66:15
Speaker A
slash marketplace or take over the existing slash workflows? Yeah, so I wanted to take over the existing slash workflows, because that was just a separate proof of concept that I did. And so, okay, given that, I am now ready to create the plan.
66:28
Speaker A
Okay, so now it's going to, sorry, that is my bad. Now it is going to resume the Archon workflow and let it go past the explore step now into the creating the structure plan step. All right.
66:44
Speaker A
Okay. You know that other pull request that you created? You didn't actually make it. That was an old one. So I need you to dispatch that third workflow again too. Sorry, I need to correct that. It's only running two
67:00
Speaker A
workflows, but it needs to run all three. Okay, there we go. PIV is now creating the plan. I'll let you know when it's ready for your review. All right, good. There we go.
67:13
Speaker A
All right. I'm going to take a sec to look at the chat now as we're waiting for this.
67:26
Speaker A
Will we be able to comment on a section of a community workflow to say what we would optimize or which variants are useful? And then the comments can be rated so we can see possible good variants. That's a really good idea. Now the
67:37
Speaker A
thing is, I wouldn't be able to do that yet because this initial version of the workflow marketplace is not gonna have any kind of authentication, but that is in the roadmap. Like I definitely want to evolve it to get to that point where
67:48
Speaker A
we'll have like full authentication with like a comment section and uploading and everything. That just won't be there for the V0 of the marketplace.
67:58
Speaker A
Anyone found a good tutorial to integrate BMAD with Archon? You know, Eric, honestly what I'd recommend is just cloning the BMAD repository locally and then pointing Claude Code to it and saying, hey, I want to build BMAD or
68:14
Speaker A
like these things from BMAD into Archon workflows. And it should just load the Archon skill and help you ideate there. My recommendation would just be specifically to have it ask you a lot of questions, so that you guys can get on the same page
68:28
Speaker A
with that, with like what features you wanna take from BMAD, or if you really wanna try to copy the whole thing, which might take a while, but you could.
68:38
Speaker A
What Archon does is basically keep the agent more grounded with targeted context. Otherwise it can get confused by specific context that is not relevant. Exactly, yeah. One of the best parts of having a full workflow where we are stringing together different coding agent sessions is
68:52
Speaker A
we can make each session very, very focused. And that's powerful for coding agents because they get overwhelmed with context just like people do if you try to have it do too many things in the same conversation.
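The idea of passing only structured output between otherwise isolated sessions can be sketched like this. This is a toy illustration of the artifacts pattern mentioned earlier in the stream, not Archon's actual implementation:

```python
import json
import pathlib
import tempfile

def run_node(name, fn, inputs, artifacts_dir):
    """Run one workflow node and persist only its structured output,
    so the next session reads a small artifact, not a full transcript."""
    result = fn(inputs)
    out = pathlib.Path(artifacts_dir) / f"{name}.json"
    out.write_text(json.dumps(result))
    return result

def load_artifact(name, artifacts_dir):
    """Load the structured output a previous node left behind."""
    return json.loads((pathlib.Path(artifacts_dir) / f"{name}.json").read_text())

# hypothetical two-node chain: explore -> plan, where fn stands in
# for a real coding-agent invocation
artifacts = tempfile.mkdtemp()
run_node("explore", lambda _: {"files": ["src/registry.ts"]}, {}, artifacts)
plan_input = load_artifact("explore", artifacts)
```

Each node starts with a fresh, small context: the plan node sees only the explore node's findings, which is what keeps long workflows from overwhelming any single session.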
69:08
Speaker A
What about having Archon run as a standalone without the use of a CLI? Why have the models run within Archon to power it? So, I mean, we don't really want to compete with other coding agents. So I always want Archon to
69:23
Speaker A
run Claude Code or Codex or Pi under the hood. I don't want to have the model run directly in Archon and have it be another coding agent competitor. I think that's what you're asking. I'm a little confused by your wording. Because
69:38
Speaker A
Archon is running as a standalone already. I don't know why you wouldn't want a CLI. So yeah, maybe you could clarify.
69:48
Speaker A
Is it possible to let a workflow run without a work tree? Yes, that is very possible. Actually, let me show you. So if I go into Archon here, I'll just open up a new session. How do I run an Archon workflow
70:07
Speaker A
without a work tree? So I'm just doing this to show you that when Archon has the Archon skill loaded, it has full context around workflows and parameters and how to run them, and so it'll be able to grab the answer from
70:19
Speaker A
me very quickly. But we have a no work tree flag that you can specify when you run a workflow. And so that way, if you just want to operate directly on your code base and you don't want to deal with a separate local
70:33
Speaker A
copy, you can just do this. So this is actually how I generally recommend using Archon when you have a non AI coding workflow. And yes, you heard that right.
70:42
Speaker A
You can use Archon for a lot more than just coding. Really any agentic workflow you can create as an Archon workflow. And a lot of those where you just have like a single output file, like a video or a PDF or something, you
70:53
Speaker A
don't really need a work tree for that. That's more just useful for any kind of like code changes that you are making. All right.
71:03
Speaker A
Let's see where we're at. All three are working with the default assistant, Claude, in effect.
71:07
Speaker A
These workflows are in the web UI again. Okay, so we had a little bit of a blip there, but now we have them all running. Cool. I'm going to check, actually really quick, I need to check my Claude usage to make sure that
71:21
Speaker A
I'm not hitting something nasty. Let me look at that real fast off camera. Okay, I've used 41% of my five hour, which is actually not bad, especially because I did a lot before the stream as well. These workflows, they go for a
71:34
Speaker A
long time. But here's another really important thing to understand. Archon workflows are actually pretty token efficient. Because each coding agent is very focused. You're only giving it the context it needs at each step of the way. And you can specify the model
71:51
Speaker A
that you're using in each node. So for some nodes like research, you just need Haiku. For planning, you might want Opus, and then you'll use Codex for review, like whatever it is, you get to choose the provider and model per node. So even
72:04
Speaker A
though these are very long running processes, they're actually pretty context efficient. Like I've been running a ton of workflows and I have another one running on my other machine, like I said, earlier, I did a lot of prep work for the stream this
72:14
Speaker A
morning within the same five-hour window and I'm still only at 41% of my five-hour window. It's not bad at all. Alright, yeah, so let's go back and make sure that we don't have an update for any of
72:27
Speaker A
these. I'm just waiting for the plan to be created. And again, I've designed the process to be pretty comprehensive, so I'll have to wait a little bit, but it's really up to your prompting to determine how long the process takes, just like we
72:39
Speaker A
have for skills and commands. It's entirely up to how much you're asking it to do here. What's the best way to synergize open source resources like Archon plus Symfony? P.S. non-techie here. I don't really think you would use Symfony with Archon. Symfony,
73:01
Speaker A
that's the thing from OpenAI, right? I looked into this a little bit here: an open source spec for Codex orchestration. Honestly, I feel like Archon sort of replaces Symphony. Because Symphony is sort of like their Codex harness.
73:18
Speaker A
So I don't really know. Like, honestly, my answer for you is, I don't think you would want to. Right? Like... "Coding agents using Linear as a state machine to work alongside us": you can build this whole Linear-as-a-state-machine
73:32
Speaker A
into an Archon workflow. You don't need Symphony if you're building Archon workflows. So two things I'd say here. One is, if you want to build an Archon workflow that operates like Symphony, you can literally just open up Claude Code or Codex. You can,
73:47
Speaker A
you know, load the Archon skill, tell it to read this blog post, and say, "I want to build this Symphony harness as an Archon workflow." And it can knock that out for you, probably in one shot. The other thing you can do is
74:04
Speaker A
you could, I mean, like I'm saying, I don't know if I'd recommend this necessarily, but you could have it load the Archon skill, not to build something to replace Symphony, but to ask it, how could I incorporate Symphony into an Archon workflow?
74:17
Speaker A
Just see what it comes up with. I really don't know. Like I said, I feel like you wouldn't really use them both together, but you could just see what it says. Because if it reads this and it understands the context of Symphony and
74:29
Speaker A
you have it load the Archon skill, it can combine that knowledge together and maybe come up with some way that they could work together. But it really comes down to, for any kind of open source project you want to incorporate with Archon
74:41
Speaker A
or take inspiration from: just load the Archon skill, load in that repo or whatever, and tell it how it can combine or build an Archon workflow around that open source project. All right, so a follow-up: within Archon settings UI you have an LLM provider list dropdown and a dropdown
75:00
Speaker A
for the models they provide. Yeah, so that's more experimental, something we're working on right now. But this will update your environment variables directly when you change these things. So you can have the default provider, so that if you don't specify the specific model or provider in a workflow or
75:23
Speaker A
a node, this is what it would default to. So it's kind of like the global setting. So we have a sort of override where in an Archon workflow, we can specify here's the provider I want to use and the default model. And then
75:37
Speaker A
at the individual node level, you can also specify the provider or model. So this takes precedence. What you have set at the individual node level always takes precedence. And then if something isn't specified for the individual node, then what you have as the
75:53
Speaker A
default for the workflow takes precedence. And then if even that isn't specified, that is when it will use the default, like global configuration that you have set as environment variables that you can also manage right here. Alright, oh,
76:08
Speaker A
take a look at that. The Archon PIV loop is now in the pause state.
76:12
Speaker A
So it's ready for our feedback now. We can see that it has created the plan. So let's go back, take a look at that. So the plan looks solid, here's a summary for your review. And then I can say, you know, like give
76:26
Speaker A
me the path to the plan so I can review myself. And since this runs in a work tree, the plan is somewhere that isn't super easy to get access to, but you can just ask it and then I can
76:42
Speaker A
just open the file here. So this is the plan.md. And you can see that it is within a work tree. So this is just the unique identifier for the work tree. That's why we have a random alphanumeric string here. So if
76:56
Speaker A
I open up the plan, now I can review things. And if there's anything that doesn't seem perfect to me, I can just go in here and say, hey, I don't like the validation here, change it to XYZ. Whatever that is, I can have
77:07
Speaker A
it iterate because the Archon PIV loop workflow is a loop. I get to specify any changes I want as many times as I want. We have the summary, the mission, the success criteria. Everything that I showed you in that markdown structure earlier,
77:24
Speaker A
it has built that exactly here. It's created the structure plan that we can then send into the next node in the Archon workflow for implementation, assuming that we actually agree with everything in the plan here. So here's what's in scope, here's what is
77:38
Speaker A
out of scope. Like I dropped the marketplace trusted for example. You remember me specifying that earlier when I was giving feedback in the exploration phase. Here are the key files that we need to create and update. Like we're updating the contributing guide. Here's
77:53
Speaker A
the pattern to follow. So I had to look at how I built the roadmap within the dev branch. And what else here? We got the architecture, task list.
78:02
Speaker A
I mean, I don't need to go through all of this. The point is it's very comprehensive on purpose because this structure plan is literally going to be all of the context that I send into implementation for Claude to handle the actual
78:17
Speaker A
build. So if I go to the workflow here, I just wanna show you the YAML for this. Sorry, that's the wrong repo. I want to show you the YAML for this really quickly. So let me go out and then back into defaults. I'll
78:27
Speaker A
open up the Archon PIV loop workflow just to show you the structure that we're going through here. So we finished the exploration. This is the prompt that I had for the exploration, right? Like the argument specifies here is what we're building. And
78:42
Speaker A
then we have a loop. So we're looping until the plan is ready. So this is what we saw where it came back to us with questions. It asked us questions and then after it asked us if we're ready. So we have this whole
78:55
Speaker A
process where it's interactive. We get to talk to it as much as we want.
79:00
Speaker A
And then it creates the plan. And so it depends on the exploration being done, obviously. We don't want to run these in parallel. That's how we make sure they don't run in parallel. And then we're starting in a fresh context session. Because
79:12
Speaker A
basically we're just going to pass the final... output from the explore node. This is the cool part about Archon, is we can start fresh sessions, but still pass the core output between different nodes. So here's what happened in the exploration. Here's originally what
79:27
Speaker A
the user's looking for. Now let's create a structured plan. So it's gonna follow the exact convention that we have laid out right here. This is the same prompt that was showing in the Archon web UI. And I'm specifically telling it to output the
79:42
Speaker A
plan document to this special artifact directory variable. So this is a variable that works by default for every single Archon workflow. This is like the workspace for the workflow execution. And so this is the path that I was showing
79:58
Speaker A
earlier that had that random alphanumeric string, that was like the ID for the directory.
80:03
Speaker A
So that way all the artifacts and files and everything that we're passing as context between nodes, it all lives in a single place that's isolated for that workflow execution.
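The artifact-directory pattern described here, where fresh-context nodes share files through one isolated per-execution workspace, can be sketched like this. The function names and file layout are hypothetical; only the pattern itself (write plan.md into an isolated directory, have a later node read only that file) comes from the stream.

```python
# Illustrative sketch (not Archon's real API): nodes run with fresh
# context but pass outputs through a shared artifacts directory.
import tempfile
from pathlib import Path

def run_plan_node(artifact_dir: Path, exploration_summary: str) -> Path:
    """Planning node: writes its only output as plan.md in the workspace."""
    plan_path = artifact_dir / "plan.md"
    plan_path.write_text(f"# Plan\n\nBased on: {exploration_summary}\n")
    return plan_path

def run_implement_node(artifact_dir: Path) -> str:
    """Implementation node: fresh session whose only context is plan.md."""
    return (artifact_dir / "plan.md").read_text()

# Each execution gets an isolated workspace, analogous to the
# random alphanumeric work-tree directory shown in the stream.
workspace = Path(tempfile.mkdtemp(prefix="archon-exec-"))
run_plan_node(workspace, "add a workflow marketplace page")
print(run_implement_node(workspace))
```

The design point is that the implementation node never sees the planning conversation, only the file, which is what keeps each session's context small.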
80:13
Speaker A
So we create the plan and then we have the next step to refine the plan. And so this is where I can optionally give feedback on the plan. So if I give any kind of, like, I need to change this, then it's going
80:27
Speaker A
to go through this loop here. So it's just like the exploration where it's going to loop infinitely, or I guess until max iterations, just for the sake of token efficiency, I want to have a limit there, but it'll iterate up to 10 times
80:39
Speaker A
if I really have that much feedback. And so for the sake of brevity for the stream, I don't really want to iterate that much on the plan. But you can imagine that as you're going through something this comprehensive, there's probably going to be
80:51
Speaker A
some things that didn't quite nail down, either technical implementations or just general high level what you want it to build. And so this is your chance to validate and make sure that the coding agent really understands what needs to be built, the different
81:03
Speaker A
risks that we have here. Maybe you want to address some of these risks in the plan before you go into implementation. It's really up to you. The point is this document now is going to serve as the only context going into the actual
81:16
Speaker A
implementation. So we can see here that we're starting a brand new session here. So let me scroll down. So actually, it's kind of cool. One thing we do before we go into implementation is we actually install the dependencies. Because we are
81:34
Speaker A
running the workflow in an isolated code base, we have to reinstall the node modules or the Python virtual environment, whatever it is. So we do that as a deterministic node, just to guarantee that our environment is set up. Now going into the implementation.
81:48
Speaker A
And for the implementation here, We have it, it's like another loop, it's kind of like a mini Ralph loop, where it's just going to keep building until it has solved all, or it's like, you know, addressed everything in the structure plan. So we're
82:02
Speaker A
saying like, you know, here is the user message, and then we want you to read the artifacts directory, read the plan.md. So it reads the plan, it builds up a task list, and it's going to knock things out one task at
82:16
Speaker A
a time. And then it's even gonna like track the progress in another file that we have in the artifacts directory. Pretty cool. And we set fresh context to true because we don't need anything from the prior planning session. It is literally just that
82:32
Speaker A
plan.md that we're passing in as context for the implementation. And so let's go back to Claude. Let's actually kick that off. So I'm gonna say here that plan looks great. Let's implement.
82:46
Speaker A
And again, if I wanted to give feedback here, I could, and then it would just make those changes to the plan, I'd review it, and then I'd just come back and say this at some point anyway. And so we can see that Archon
82:56
Speaker A
workflow approve, and it just said approved, right? Like there's no feedback I gave. And so now the implementation is running. And so if we go back to the web UI, we can see here that it's no longer paused, it is currently implementing. Pretty
83:09
Speaker A
cool, and then while this runs, Let me go back here and say, give me a status update. I just wanna see where we're at right now with the workflows that we're invoking for the more like brownfield development here.
83:23
Speaker A
So a lot of loops. Exactly. Yeah, you know, loops are pretty important. I mean, maybe you don't realize it when you're working with Claude Code, but you're doing loops all the time. Iterative feedback, having it kind of loop over its own work.
83:38
Speaker A
You generally never want to accept the first pass output from a coding agent because it's never that good, honestly. The raw output quality from a coding agent with just a single pass is pretty bad, but what makes coding
83:56
Speaker A
agents really reliable is when you have a process for creating a plan and refining it in a loop, right? And then it implementing and kind of checking its own work and iterating again in a loop. And then you have the validation
84:10
Speaker A
process that is yet another loop. Like everything is just loops and loops because you want the coding agent to have a process of planning through things and self-correcting. And that's what we're building into the Archon workflow here. It's really easy to build these
84:22
Speaker A
kind of loops into Archon. Okay. The implement nodes are still routing to Codex. Sorry, guys. I don't know why it's routing to Codex. Oh, shoot. That's because I set that.
84:39
Speaker A
Let's see. Do number two? Well, validate is compensating; it already produced a clean PR as the real fix. Okay, let's do option one, but still fix the config so it uses Claude. So that, sorry guys, that's my bad. So I
85:04
Speaker A
was doing some testing of Codex earlier. But the thing is, the node failed, but the workflow as a whole is still succeeding. And I guess that kind of highlights something else that we have built into Archon, which is resiliency, right? We have
85:18
Speaker A
a lot of workflow durability built into Archon. So if a node fails, it can retry. If your computer crashes or your Archon workflow crashes for whatever reason, we have everything stored in state in a database that can either be SQLite or Postgres. And
85:34
Speaker A
so you can also resume workflows. It's not fully ephemeral, right? Like we have full state stored in a database so it knows where we're currently at.
85:45
Speaker A
So that's cool. So I guess we get to at least demonstrate that, but still feels a little jank, for lack of better words, the execution here. That's always the fun part of live streams is if something can go wrong, it always will. So
85:58
Speaker A
that's on me for experimenting with things and not cleaning up my environment before the stream. But it doesn't really take away from the demonstration at least. So I'm good with that. All right. How resource intensive are the workflows themselves? I'm wondering how beefy the local hardware needs to be.
86:16
Speaker A
So they're not resource intensive at all. Because most of the time, a workflow is just waiting for calls to the coding agent. So as long as you're not running a local large language model for the actual AI coding, it's not going to
86:31
Speaker A
take much memory at all. Like you could run a ton of like a dozen Archon workflows and it'd still be like, less than a gigabyte of RAM.
86:42
Speaker A
Unless you're using skills that take a lot of memory. So it's just like with Claude Code. Generally it's a very lean process, but if you're doing browser automation, so it's actually running a Chrome instance or something, then it's gonna take a lot
86:55
Speaker A
of memory. So one thing I ran into with my Dark Factory experiment when I was initially setting it up, is I have the Dark Factory deployed to a VPS
87:06
Speaker A
that only has like a few gigabytes of RAM. I think it's like four gigabytes of RAM and two vCPUs. And I tried running a ton of Archon workflows in parallel that were all validating work with the agent browser and I crashed my VPS
87:20
Speaker A
because each one of the workflows obviously had to run a Chrome instance to test the website in their own isolated environment, like within the work tree. So it took way too much memory. But assuming that the skills that the Archon workflow
87:33
Speaker A
is using aren't taking a lot of memory, then your Archon workflow won't either. Talk to me about Pydantic versus Archon. So Pydantic AI is a framework for building agents. It's very different. It's apples to oranges comparing Archon to Pydantic. Like Pydantic AI
87:56
Speaker A
is a way to create AI agents. You could build a coding agent like Claude Code, but still that would be like the agent that Archon is using, because Archon is orchestrating agents. Now one thing Pydantic does have is they have a new harness that they put out, so maybe this is what
88:16
Speaker A
you're referring to. I gotta be honest though, I haven't had a chance to try this yet. I've really, really been meaning to, because I still am a huge fan of Pydantic AI. Now the Pydantic AI harness is the official compatibility
88:31
Speaker A
library for Pydantic AI. So this is like an official addition onto the framework. And the main thing that I would say is that this still isn't specific for AI coding. If you look at some of the examples here, I think
88:48
Speaker A
they're more for like deep research. Let's see. Trying to look at some examples quick. There's definitely some similarities to Archon. It's kind of a harness builder, but it's not specialized in AI coding. I don't think they support work trees and things like that. But I
89:11
Speaker A
am actually thinking about making a YouTube video where I specifically compare the capabilities of the Pydantic AI harness to Archon and talk about when you maybe want to use one over the other. But yeah, this looks really, really cool.
89:31
Speaker A
I'm looking at digging into more harnesses. I'm curious if I should start with Archon or Pydantic or something else. So my recommendation would be to start with Archon if you're looking at building harnesses specifically for AI coding. Now, like I
89:46
Speaker A
said earlier, Archon, you can theoretically build really any agentic workflow as an Archon workflow.
89:54
Speaker A
But I would say that we haven't really optimized things outside of AI coding yet.
89:59
Speaker A
So if you want more of a general deep research agent or something like that, then the Pydantic Harness would probably be better to start. And like I said, I really want to check it out myself more. I've just kind of glazed over it
90:11
Speaker A
up until this point. But it's also cool to see that Dao is working on it. I've actually had a conversation, a couple conversations with him.
90:21
Speaker A
He's a cool guy. So cool to see him working on the Pydantic AI harness.
90:25
Speaker A
And cool to see Pydantic AI continue to evolve. Because I know I don't create as much content on Pydantic AI these days just because there's so much excitement around Claude and AI coding, agentic engineering. But Pydantic AI is still, like, I'm using it
90:39
Speaker A
every day. Fantastic framework. Well, maybe not every day, every week. Every day is a bit of an exaggeration, but I'm using it every week.
90:50
Speaker A
Building with it every week. How do you update stale information in your second brain? Do you use an automation? So one thing that I have in my second brain, I mean, I could spend an hour talking about my
91:04
Speaker A
whole reflection system here, but one thing I do in my second brain is I have a memory reflection process. It runs once a day to analyze my current memories and extract the core information into a, basically it's like a
91:20
Speaker A
memory promotion process. I have my daily logs that build up over time, basically summarizing all my conversations with my second brain. And the memory reflection process is going to look at those daily logs and take all the important stuff out into a memory.md
91:36
Speaker A
file that I have loaded into every conversation with my second brain. Another part of that memory reflection process is to look at any past memories. I have a system where it kind of just like, spot checks, previous daily logs, and it compares
91:52
Speaker A
that to recent memories to see if there's any information that's gone stale. And I don't actually delete the information, I just mark it as stale. Because I think it's kind of like knowledge graphs, where, like Graphiti, if you guys remember all my content
92:05
Speaker A
on Graphiti, we don't necessarily want to remove the information, but we want to mark it as stale so that our second brain can understand how things have evolved. I want it to still have record of what was the truth before, so that I
92:18
Speaker A
can kind of give more context into what is the truth now. I know it's a little vague, like it's hard to get into specifics right now, but I hope that makes sense. Basically, I just have a reflection process that runs once a day
92:29
Speaker A
that helps with all of my memory maintenance for my second brain. All right. Okay. Summary. Okay, so we fixed our config here, sure. So give us status of the workflows now.
92:49
Speaker A
Let's see where we're at. And then for the implementation, this one's going to probably work a while, obviously. So it's going to go through implementation in the piv loop, and then it's going to do its first round of validation and
93:05
Speaker A
iterating on that. And then once it figures that it's done, it's going to pause for our feedback. So we have yet another human gate. Not that your Archon workflows always have to have a ton of human in the loop, but I specifically wanted
93:18
Speaker A
to show a process that has a lot of human in the loop. Because when you really want to get the most reliable results possible with your coding agents, it is pretty important to have human in the loop, at least after the plan and
93:31
Speaker A
after the implementation. Like those two places, it's really important. Not that a coding agent is going to totally botch the implementation if you don't have human in the loop.
93:40
Speaker A
But if anything, it's important just to make sure that the coding agent is really aligned with exactly what you want to build, right? Like half the time when a coding agent messes up, it's not that it's bad code, it's that it just wasn't
93:51
Speaker A
aligned with you. So that's why human in the loop is, you know, just as important as ever, even with these models and harnesses getting more and more powerful.
94:03
Speaker A
Do pre-flight Archon workflow validators exist? So yes, in a very basic sense, one thing that we have with Archon is basically a workflow linter. So after it builds an Archon workflow, there's an Archon validate command that
94:18
Speaker A
can run to make sure that the syntax itself for the workflow is actually good.
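A linter like the one described can catch a lot statically, without executing anything. Here is a minimal sketch of two such checks, unknown dependency references and cycles (via Kahn's algorithm); this is not the real Archon validate implementation, just an illustration of what that kind of static check does.

```python
# Hypothetical sketch of static workflow linting: verify that every
# depends_on reference exists and that the node graph has no cycles.
def lint_workflow(nodes: dict[str, list[str]]) -> list[str]:
    """nodes maps node name -> dependency names; returns a list of errors."""
    errors = [
        f"{name}: unknown dependency '{dep}'"
        for name, deps in nodes.items()
        for dep in deps
        if dep not in nodes
    ]
    # Kahn's algorithm: repeatedly peel off nodes whose dependencies are
    # all resolved; anything left over must be part of a cycle.
    remaining = {n: sum(1 for d in deps if d in nodes) for n, deps in nodes.items()}
    ready = [n for n, count in remaining.items() if count == 0]
    dependents = {n: [m for m, deps in nodes.items() if n in deps] for n in nodes}
    while ready:
        n = ready.pop()
        del remaining[n]
        for m in dependents[n]:
            remaining[m] -= 1
            if remaining[m] == 0:
                ready.append(m)
    errors.extend(f"cycle involving '{n}'" for n in sorted(remaining))
    return errors

# An explore -> plan -> implement chain lints clean:
print(lint_workflow({"explore": [], "plan": ["explore"], "implement": ["plan"]}))
```

Checks like these are cheap and deterministic, which is why a coding agent can run them immediately after generating a workflow, before anyone spends tokens executing it.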
94:23
Speaker A
But it doesn't do any kind of like smoke testing. We are planning on building a sort of eval system into Archon to help you when you first build workflows evaluate that they work as intended. How do I validate workflows after building them? There is some validate command, right?
94:43
Speaker A
I'll just show you another example of how we can just ask it. And with the Archon skill, it'll understand what it can do to do the linting of the workflow. So, CLI, validate workflows, and you can call it on
94:57
Speaker A
a specific one. So this is what you would just have Claude, or whatever coding agent, run once it builds the workflow, to make sure the syntax is right. It just checks the structure, the validity of cycles and references, and things like
95:11
Speaker A
that. Basically just a whole laundry list of stuff that we can validate without having to actually run the workflow. Is there a workflow for creating new workflows? I tried making a couple workflows freestyle without a builder
95:25
Speaker A
with mixed results. If not, do you have a tutorial on making a workflow? So we do actually, we have this Archon workflow builder.
95:36
Speaker A
I think this is exactly what you're looking for. So I would try that out.
95:40
Speaker A
And then if you have mixed results, like you shouldn't have to use this to be able to get workflows. It really just comes down to make sure you're really specific with what you're looking to build. The Archon skill by itself should be enough,
95:53
Speaker A
but yeah, you definitely have to be specific for what you're looking for. And then I don't have a specific guide to building workflows in Archon yet. I'm planning on doing some more content on that soon. What I do have though is a workshop
96:07
Speaker A
in the Dynamis community. So I did a workshop a couple weeks ago on building Archon workflows and like my best practices for that. Definitely some content will bubble into YouTube around that stuff as well soon too. All right, so where are we at
96:22
Speaker A
now? Let me go back to the web UI. Refresh, okay, so we're still going through the implementation here. So chugging along.
96:34
Speaker A
Let's see. Yeah, we're actually almost done with all of the Brownfield stuff too. Cool.
96:40
Speaker A
So yeah, we'll wait for this to be done. I'm going to take a quick restroom break. So I'll be back in like two minutes here. And we're waiting for this stuff to finish anyway. So yeah, I'm going to go ahead and just like
96:53
Speaker A
mute my microphone and turn off my camera. I'll be back in like two minutes.
98:05
Speaker A
All right, I'm back. Sorry, I had to really take care of that. All right, so yeah, still at the same place here. So I'll just keep going back to the chat with you guys here. Let's make an Archon agent that makes Archon
98:21
Speaker A
flows. You know, that's the goal of the workflow builder workflow, right? Like why build a separate agent when like Archon already allows us to build the harness for anything? That's what this is. But if you have more ideas, I'm definitely open to
98:34
Speaker A
it. Right. Are there any task-related tutorials anywhere besides your normal YouTube videos or lives? If not, that would be super helpful.
98:44
Speaker A
I'm getting stuck trying to go through all your vids to find exact steps. So yeah, I mean, Dynamis community, it definitely has more content where I go deeper on these things. Another thing that I have available if you guys wanna check it out
98:56
Speaker A
is chat.dynamis.ai. So there is a kind of AI tutor that I've been working on. It's a little slow, because it does really comprehensive searching, but this is an agent that I have available for you guys where if
99:12
Speaker A
you are just a subscriber of my YouTube channel or like... Sorry, let me rephrase that. By default, it will search all of my YouTube videos. So it has indexed all of my YouTube videos. You can ask it questions because I know sometimes
99:27
Speaker A
it can be hard to navigate around a lot of YouTube videos. And then if you're in the Dynamis community, this agent has the ability to search through all YouTube videos, and all workshop and course content as well. And it also cites its sources and
99:41
Speaker A
can tell you exactly where to go. So for example, I asked it, in what workshop does Cole cover hierarchical RAG? And it points me to the exact workshop in the Dynamis community where I do that. Take a look at that. Boom. Hierarchical
99:58
Speaker A
RAG for large knowledge bases. So it can give you quick answers and also point to the videos or if you're in Dynamis, the Dynamis workshops or course videos that cover that specific thing. So like for example, I asked like a really high level
100:11
Speaker A
question here, what RAG strategies does Cole cover in his content? And we got re-ranking, agentic RAG, knowledge graphs, query expansion, multi-query RAG, self-reflective RAG, contextual retrieval, context-aware hybrid chunking. I pretty much have everything at this point. Which is pretty cool
100:27
Speaker A
because one of the things I set out to do when I first made my YouTube channel is I wanted to cover all of the important RAG strategies and chunking strategies. And I've pretty much done that at this point. I mean, there's a couple
100:38
Speaker A
of things that I haven't really covered that much, like late chunking, but as far as what you actually care about or what I'd recommend, I've covered it. So anyway, this is a resource for you to check out. It's just chat.dynamis.ai for
100:53
Speaker A
you to make an account. It's just if you're not in the Dynamis community, it'll only search over my YouTube content, obviously. All right.
101:06
Speaker A
Did I drop Kimi or Minimax? No, I'm still using them for the Dark Factory.
101:11
Speaker A
It's just that for my general development, most of the time I am still using Claude Code, but I have been getting pretty good results using Minimax and Kimi within the Dark Factory. Definitely not quite as reliable as Claude, but for the cost, it's pretty
101:23
Speaker A
effective. Out of curiosity, have I done a direct comparison for the same scenario, Claude Code versus Codex, or rather, Opus 4.7 versus GPT 5.5? I have not yet, but that is another video that I'm potentially interested in doing. The only reason I'm hesitant to make
101:44
Speaker A
a comparison video is just because so many people have already, and it's really difficult to do comparison that actually has real substance behind it, because the problem is I can't just run one PIV loop with Opus and one with GPT. There's
101:59
Speaker A
so much variance between different runs even with the same model that it's not really a fair comparison. I'd have to do a lot of testing, but I might do that. We'll see. We'll see. Could be a good livestream too, actually.
102:13
Speaker A
What does Claude Enterprise provide? API credits, or usage like Max but for employees? So they used to have... It's sort of like a Max subscription, but for employees.
102:23
Speaker A
But I think it's like really you just pay for the API credits now. It's kind of insane. It's really, really expensive. But you get direct support from Anthropic.
102:32
Speaker A
And I think you actually get more powerful models because our subscriptions with Claude are actually subsidized. Like we don't get the most powerful model when we have our individual subscriptions. And then Enterprise, I think, has access to Claude Mythos as well. So
102:47
Speaker A
it's just an entirely different model. I think those are the main things. I mean, there's probably quite a bit more, but I usually don't focus on enterprise because that's just not what I teach for most of like my YouTube and Dynamis stuff.
103:02
Speaker A
But there's also like a lot of things for security and compliance. Like one thing for, so I've done some corporate, I do actually a good amount of corporate trainings for agentic engineering and just generally like using Claude and Claude Cowork and things like
103:15
Speaker A
that. And so I have some experience with how it works. One thing that the enterprise plan gives you, it's kind of cool, is you basically have like an internal skill ecosystem. So you can have a plugin marketplace for
103:32
Speaker A
your company. And then you can publish skills. If you have the admin rights to the enterprise plan for your organization, you can publish skills and other kinds of plugins that are automatically downloaded for all the employees. So you can essentially create
103:48
Speaker A
your own context engineering framework for your company with the enterprise plan. So that part's pretty cool. Yeah, and then like all the security features and stuff.
104:02
Speaker A
But yeah, it is usage based pricing. It's really expensive. It's Claude, Claude Enterprise is so insanely expensive.
104:14
Speaker A
How can I make Claude always write commit messages below 800 characters? Well, I mean, really what it comes down to is you prompt it to write under 800 characters.
104:24
Speaker A
And then this could actually be a good use case for Archon. Basically you'd have a workflow where it's like you have it write the commit message as a markdown document. And then you have a deterministic step in the Archon workflow that validates that
104:36
Speaker A
it's actually less than 800 characters and then you have it retry the commit message if it isn't. So that would be my recommendation because it's not enough to always just ask it like it has to be under 800 characters because sometimes it doesn't
104:49
Speaker A
listen. That's the whole thing I'm talking about with Archon here is we need to be reliable by taking the decision away from the coding agent as much as possible.
104:59
Speaker A
So in this case, checking if is it really less than 800, like doing a deterministic check for that and having it retry if it isn't.
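The pattern described above — generate with the agent, validate deterministically, retry on failure — can be sketched in plain Python. This is a minimal illustration, not Archon's actual workflow API; `generate` here is a hypothetical callable standing in for the coding agent step:

```python
MAX_LEN = 800  # the character limit from the chat question

def check_length(message: str, limit: int = MAX_LEN) -> bool:
    """Deterministic validation step: pass only if the commit message fits."""
    return len(message.strip()) <= limit

def commit_message_with_retry(generate, limit: int = MAX_LEN, max_attempts: int = 3) -> str:
    """Call the agent (here just a callable) until the deterministic check passes."""
    for attempt in range(max_attempts):
        message = generate(attempt)  # attempt number lets the retry prompt say "shorter"
        if check_length(message, limit):
            return message
    raise RuntimeError(f"no commit message under {limit} characters after {max_attempts} tries")
```

The key design point is that the pass/fail decision lives in ordinary code, not in the model, which is exactly the "take the decision away from the coding agent" idea.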
105:09
Speaker A
All right, let's go back to see where we're at with Claude here. Implementation's still running and then, okay, are we done with these now? I wanna see if I actually have a pull request. I guess I can look at the repository as well. So pull
105:31
Speaker A
requests. Oh yeah, we do. Okay, yeah, we got the pull requests for each one of these. This one was open eight minutes ago. So we added the provider for Opus 1 million to our implement nodes.
105:48
Speaker A
Oh, okay, so here's the problem. Three bundled workflows use Opus for the model, but they do not set the provider to Claude. When the user's default assistant is Codex, then the DAG executor silently routes those nodes to Codex. Oh, that's... okay, I
106:01
Speaker A
understand. So literally the problem that we were testing is like why the workflows were failing earlier. So, I mean, that's always the tough part with live streams is that I move kind of quickly, so I miss things that I would normally catch. But
106:12
Speaker A
that, okay, that's really good to know. So we're basically, this pull request is just fixing some workflows, really, is all it comes down to. So this is good. And so we have our pull request here. And then if we go to the conversation,
106:25
Speaker A
we also have the validation. So I guess we haven't gotten to that step yet.
106:30
Speaker A
I guess that's where we are next. So if I go back to Claude, we can take a look at that. So all three are still running. Yeah, okay, so they're all, this one just entered Archon validate. So we're doing the validation. But essentially
106:44
Speaker A
what's gonna happen here, I mean we're pretty close to being done, is it's just gonna have that separate node that does the validation at the very end of the pull request. And then it'll create a comment here if there's anything that we need
106:54
Speaker A
to address, and it'll go through that iteratively. So that's like the last loop that we have for the workflow. So that's pretty cool. And then let's see where we're at with the greenfield stuff. I don't know if I'm gonna really be
107:08
Speaker A
able to get to the end of this here in the live stream. The problem — I mean, it's like a good and a bad thing. Like Archon workflows, at least the ones that I run, I've designed them to take a while because they are
107:19
Speaker A
fire and forget. But it also makes it a little bit tougher to demonstrate them live in a stream just because we can't always see it come to completion, just because there's so much that goes on in the workflow. But I hope it's
107:32
Speaker A
been useful for you guys just to see the general process, especially like how I have human in the loop built into the PIV loop workflow upfront.
107:44
Speaker A
Oh, the implementation is complete and ready for my review. Okay, so cool. Let's get through this here. So here's the branch, here's the work tree. I'm going to say, I want you to start the docs site of Archon in this work tree and
107:58
Speaker A
then tell me the URL so that I can validate the page myself. So... we can do a little bit of validation and obviously it did its own. Now it's our turn, right? Like let's start up the app. Let's take a look at the
108:11
Speaker A
workflow page and see if it looks good. From the chat in the meantime: "One thing to note, I created a workflow that used 50% of the 50-hour limit within five minutes due to how it incorrectly stacked tool usage and context," which was nuts. Issue fixed, of course. Yeah, that's good. When you first
108:37
Speaker A
create any kind of workflow, it's not that Archon is a token hog. It's that it can be easy to accidentally create workflows that do way too much. And so it's important when you first build one to always validate that and watch your usage
108:52
Speaker A
and adjust accordingly. Okay, so here's our URL. Let's go to the browser. Let's paste that in. Whoops, I didn't actually copy. Hold on. Let me copy this.
109:04
Speaker A
There we go. Okay. All right, there we go. So this is our initial page for workflows. And so we have the banner here, like I wanted: community submitted, Archon hasn't audited every workflow, review source before installing, or trust the author, right?
109:20
Speaker A
This is pretty cool. So we have the workflow, we can copy the command to install it, we have all the metadata here, we can view the YAML on GitHub.
109:30
Speaker A
There we go. Okay, this is actually, this is really good. This is a great starting point. Very cool. And so I can also ask follow up questions, right? So like as a part of the validation, I don't have to like just dig into
109:41
Speaker A
the code myself and look through every single line. I can also ask clarifying questions.
109:45
Speaker A
Like for example, Help me understand, based on the implementation here, what is the flow for a user to submit their own workflow? What does it look like in their own repo? What does it look like to create the pull request and the file
109:57
Speaker A
they need to edit? What does that JSON file look like? So I'm just kind of giving a brain dump here of all my questions, just to make sure that everything was implemented correctly. And then obviously if I wanted to open up the work
110:06
Speaker A
tree, and look at the code there, I can definitely do that as well. So you can perform a code review just like you would do with your normal coding agent, it's just within Archon you have to navigate to that work tree, but it's
110:19
Speaker A
still the same thing where you're looking at the code and the codebase. We can also do a pull request review once we create that pull request. So the contributor keeps their own workflow YAML wherever they want — any public GitHub repo. So it looks
110:32
Speaker A
like this. And then when they fork Archon, they add one entry to the marketplace.ts.
110:37
Speaker A
And so we have the slug, author, description, source URL, blah, blah, blah. That's the entire PR. Yeah, the one thing is, okay, what if an Archon workflow is more than just the YAML? What if we have commands and skills
110:51
Speaker A
in other contexts that also need to be passed in? Do we have support for that? So I think I actually might have just caught something that needs to be adjusted here. It might not be enough. So we'll see. Let's bring this back.
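To make the "one entry in marketplace.ts, that's the entire PR" idea concrete: the fields read off the screen (slug, author, description, source URL) suggest a simple validation that an automated PR-review workflow could run. This is a hedged sketch — the field names and schema are assumptions based on what's visible in the stream, not the real marketplace.ts definition:

```python
# Field names are assumptions from what's read off the screen; the real
# marketplace.ts schema may differ.
REQUIRED_FIELDS = ("slug", "author", "description", "sourceUrl")

def validate_entry(entry: dict) -> list:
    """Return a list of problems with a proposed marketplace entry (empty list = OK)."""
    problems = [f"missing field: {field}" for field in REQUIRED_FIELDS if not entry.get(field)]
    url = entry.get("sourceUrl", "")
    if url and not url.startswith("https://github.com/"):
        problems.append("sourceUrl should point at a public GitHub repo")
    return problems

# Hypothetical example entry for illustration only.
example_entry = {
    "slug": "piv-loop",
    "author": "someone",
    "description": "Plan, implement, validate loop",
    "sourceUrl": "https://github.com/someone/archon-workflows/blob/main/piv.yaml",
}
```

A deterministic check like this is the kind of step that could gate auto-merging community PRs, in the spirit of the automated review workflow mentioned later in the stream.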
111:15
Speaker A
Free Claude Code versus OG Claude Code for third-party models. I heard that the OG Claude Code is less efficient with third-party models. You know, I've actually had pretty good experience myself. I've been routing Claude Code to both Kimi K2.6 and Minimax M2.7
111:30
Speaker A
for my Dark Factory experiment. I guess it has felt kind of slow. I don't know, but I actually haven't tried Free Claude Code before. But yeah, I've been having pretty good results just routing directly with the official Claude Code. Okay,
111:48
Speaker A
so good. What I thought is actually a gap here. So V0 has a real gap. The current install command downloads exactly one file, the YAML. If the workflow references a custom command, that lives in a separate place — the commands folder in the author's
112:01
Speaker A
repo — and never gets installed. The fix, roughly in order of simplicity: convention-based directory install — change the source URL from a single file URL to a directory URL. When installing, the CLI fetches a manifest or walks the directory tree and pulls down everything.
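The manifest-based variant of that fix can be sketched as a pure planning step: given a base directory URL and a list of relative paths, produce the (download URL, local path) pairs to fetch. The manifest format here is an assumption (a flat list of relative paths), not Archon's actual install format:

```python
from pathlib import PurePosixPath

def plan_install(base_url: str, manifest: list, dest_dir: str) -> list:
    """Map each relative path in a manifest to a (download URL, local path) pair.

    Rejects absolute paths and '..' components so a hostile manifest can't
    write outside the destination directory.
    """
    plan = []
    for rel in manifest:
        path = PurePosixPath(rel)
        if path.is_absolute() or ".." in path.parts:
            raise ValueError(f"unsafe path in manifest: {rel}")
        plan.append((
            f"{base_url.rstrip('/')}/{path}",
            str(PurePosixPath(dest_dir) / path),
        ))
    return plan
```

Separating the plan from the actual download keeps the risky part (network I/O) trivially simple and makes the path-safety check easy to test on its own.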
112:15
Speaker A
Yes, yes, yes. Yes. A is good. So, resume the workflow with feedback: address this. Right, so I found a real gap in the implementation. Thank goodness, otherwise this would have been a pretty bad implementation. This is the last thing I really
112:37
Speaker A
feel like we need here. So now I'm giving this feedback and we're gonna go back into the piv loop workflow. Right, so we run the approve command. I guess I should probably rename this to continue because I'm giving feedback. I'm not really
112:51
Speaker A
approving, I'm saying like, okay, here's feedback that you need to address. And so now the workflow is back running, addressing that. And then once it's done, it'll come back for me to approve again. And we're getting to the end of the PIV loop
113:02
Speaker A
here, because once we give the approval, it's going to create the pull request. And we'll have that as the final artifact from the Archon workflow. So again, every single Archon workflow, it's always some kind of issue in GitHub or a
113:16
Speaker A
ticket in linear or Jira or whatever, that is the input. And then the output is always a pull request, that artifact for me to review. It also makes it easier because then I don't have to always like dig into the work tree code
113:28
Speaker A
base. All right. So a quick recap of what we've done for those of you who haven't been in the stream the whole time. We started by doing brownfield development with Archon. So I showed what it looked like to take my agentic coding process, package it up into an
113:45
Speaker A
Archon workflow to handle a few different GitHub issues. So these could be bugs that we want to fix or new features that we want to build. So I used Archon to handle each one of these issues in
113:58
Speaker A
parallel here. And we have a pull request created for each of them now — we've actually gotten to that point. Let me close out of a couple of things here. Let me go back to Archon, pull requests. We have the three pull requests created and
114:12
Speaker A
we'll see how far we've gotten in the review as well. There we go, yep, so we have our comprehensive PR review that finished 16 minutes ago. So this is like the last step of that fix GitHub issue workflow and it looks like there's
114:26
Speaker A
only one small issue that we'd want to address. And then we ran that next step of the Archon workflow to actually fix it. And it only took three minutes because it was a single little issue we had to address. So now this
114:41
Speaker A
is pretty much ready for us to review ourselves. Now, as the human reviewer, I can just take a look at the get diffs and the PR or however I'd wanna take this forward. But Archon did its job here, going from issue and all
114:53
Speaker A
the context there, all the way to a fully validated pull request. And we'll have the same done for the other two in a little bit. And then the other thing that I've done in the stream here is I've also worked on this workflow
115:05
Speaker A
marketplace. So doing more of a green field, like let's build a new feature from scratch, going through a much more intricate planning process with my PIV loop. So we've seen Archon do exploration, create the plan, iterate on it with us, go into the
115:19
Speaker A
implementation, validate its own work, because we don't want to trust the coding agent on its first pass, and then we have also had the opportunity like we're doing live right now to iterate on the implementation. Like, hey, there's this problem that we saw,
115:33
Speaker A
let's go back into the PIV loop and resume the workflow to address our feedback and then go through its own validation again before it comes back to us. So all these workflows have taken a while, not because Archon is slow or Claude is
115:45
Speaker A
slow, but because there are a lot of steps that we have here. It's comprehensive on purpose, because this is what gets me insane results. And so going back to what I was talking about at the start of the stream, because someone had a good
115:58
Speaker A
question. I'm not 10xing my output with Archon because things are blazing fast. I'm 10xing my output because I can run things in parallel and it's so incredibly reliable. Like I just have my second brain kick off these... fire and forget tasks with Archon
116:15
Speaker A
and I go and continue my work and I come back to these pull requests that have full planning and implementation and a ton of self-validation and self-correction, by the time control is passed back to me, I really, I still do a lot of
116:28
Speaker A
validation myself, but it's pretty quick. Like it does such a good job for handling all these issues when we have such a guided and curated process that it goes through with the Archon workflow.
116:44
Speaker A
So let's go back and I can say, give me a status update. I've been spelling things so poorly this stream, but Claude doesn't care. All right, give me a status update. I'm just getting kind of lazy here, but it's fun to do this once in a while.
117:06
Speaker A
If you're impatient, just say what's going on right now and it can just quickly tell you. It's nice.
117:16
Speaker A
How does Archon relate to Beads or can it integrate with Beads? Yes, so Beads is something that you would integrate into an Archon workflow. I have not done that myself, but I actually have thought about doing that for a YouTube video.
117:34
Speaker A
That would be super cool. So basically, Beads would be the memory system that Archon uses as it goes through the implementation. If you wanted to, my recommendation — I know this is kind of like the easy answer, but seriously, try
117:50
Speaker A
this, you'll get good results: you clone the Beads repository from GitHub, so you bring it into your local system, and then you load up your coding agent with the Archon skill. And so you say, you know, load the Archon skill,
118:07
Speaker A
look at the Beads repository, and then help me build a workflow that integrates Beads directly for a PIV loop or whatever you want your Archon workflow to do.
118:18
Speaker A
And it can pull the context from both, like understand how to build Archon workflows, understand how Beads works, marry those two things together. And yeah, I mean, not like it'll be perfect first try, but Claude does a really good job kind of combining
118:32
Speaker A
ideas together into something that marries the two. All right, cool. So, yep, we're almost done with our brownfield development here. And then... structured install support is done. I'm waiting for your review. Here's what changed. Okay, so
118:58
Speaker A
yes, this looks good. I'm just going to approve it now so that we can see the pull request get created. So now we're just continuing the workflow again, really getting to the end of the PIV loop. So the PIV loop will now finalize
119:11
Speaker A
and create the pull request. Very good. And then just for the satisfaction of it, I'll actually merge one of these pull requests as well. So I think this one would be a good one. If I was off stream, I would do
119:27
Speaker A
more of a review here, so I'm just going to pick a really simple one.
119:30
Speaker A
I think this one is pretty much done. We're running the CI right now, but just two minutes ago we fixed the things that came up in the review.
119:44
Speaker A
So like here, there's a couple of medium issues here that we addressed. And so if we go to look at the diffs for the pull request here, I'm just going to go through this really quickly. So we fixed the workflows and then we
119:59
Speaker A
made a note in the authoring workflow so this doesn't happen again. So I actually really appreciate how proactive it is here. And then we just had to update some unit tests just to make sure for the sake of regression testing. And...
120:17
Speaker A
I actually don't really know why it did this comment, but whatever. And then I think this is just like a linting thing here. Okay, good, so I'm gonna call this ready. And I really — I mean, if I wanted to run a separate review
120:28
Speaker A
and be really confident, we have, of course, Archon workflows for full pull request review as well. But a lot of that is just already built into the whole GitHub issue fix workflow, doing the code review in a separate session. And so I'm gonna
120:42
Speaker A
call this good. I'm gonna go ahead and merge this now. I guess it actually changed another file under my nose here. So the workflow is still running. Maybe I should be patient. I'm just trying to get that payoff here for the livestream. I
120:56
Speaker A
don't actually know which file it just changed. I can look at the commit history here. I made a commit one minute ago. Simplify. Okay, what did this actually do?
121:08
Speaker A
Okay, it's something super basic. Okay, I'm sorry, I'll let the workflow finish. It's almost done, I just need to be patient here. All right. Okay, so, and our PIVLOOP's done. So we have a pull request open. Let's take a look at that. Go
121:25
Speaker A
back here, pull requests. All right. So we have our marketplace workflow v0. There's a lot of files that are changed, a lot of code changes, but just because we created a brand new section of the website. So if we take
121:41
Speaker A
a look... this one I'm not gonna merge live right now. I definitely want to do more of a review on this later this weekend. But it's looking pretty good. Marketplace commands, so that we have the commands to install the workflows now, and
122:02
Speaker A
We have the marketplace page for the docs. Okay, yeah, this is looking really good. And then we have the updates to the contributing.md as well. So if you want to contribute a workflow to the marketplace, you just have your coding agent read this file so it can guide you
122:23
Speaker A
through exactly the publishing and PR process. So pretty cool. Let's go back and see where we're at now. I just want to merge one of these pull requests here to end off this stream. All right, the last one in flight is 1607. Okay, so
122:46
Speaker A
I think we're good to merge the other one now, which was... this one. Yeah, okay, so this one's good to merge. This is the one we were looking at just a couple minutes ago. So I'm gonna go
122:57
Speaker A
ahead and merge this now. So it says the pull request is still a work in progress. I'm gonna say it's ready for review. And then I could do my review kinda like I did already. So I'm just gonna say this
123:11
Speaker A
is able to merge, squash and merge. And boom, there we go. All right, so we shipped a full pull request, at least one. And obviously things are a lot slower here because I'm really taking my time explaining things and I could have done
123:22
Speaker A
a lot more in parallel if I wanted. Like within a two hour window, I can realistically ship a dozen pull requests if I have a lot going in parallel and I'm not spending a lot of time talking to you guys in the chat,
123:34
Speaker A
which of course I want to in a live stream, but yeah, there we go.
123:38
Speaker A
So then I'll just take care of the other ones this weekend here, get these merged, and then get the other parts of the workflow marketplace built out. I also have to build that Archon workflow that I'm going to have run automatically each
123:51
Speaker A
time someone creates a pull request to add a workflow. So it'll do the review and merge automatically. So that way there won't be a ton of maintainer burden on me and Rasmus and Thomas to review all the workflows because that's why we have
124:03
Speaker A
the banner here saying that we as maintainers haven't audited every workflow. I think that's fair. You just have to trust the author like us or whoever is sharing or do the review of the workflow yourself before you actually use it. I would
124:19
Speaker A
highly recommend. All right. Cool. So yeah, I think that's a good place to end the livestream here. It's been a solid, a little bit over two-hour stream. I know there are a couple of questions that are still in the chat here. I tried to get through
124:38
Speaker A
pretty much everything. But yeah, I'm gonna go ahead and end the stream. For any questions I didn't get to, feel free to put a comment on a YouTube video, join the Dynamis community because I'm in the Dynamis community every single day.
124:54
Speaker A
And so always answering questions. We got daily events, a weekly workshop that I do.
124:58
Speaker A
And we have three courses in the Dynamis community now. We have the AI Agent Mastery course, Agentic Coding course, and now the new Second Brain Bootcamp. So four hours for you to build your own AI Second Brain. I personally use mine literally to
125:12
Speaker A
save 20 hours a week. I'm not exaggerating. Like I've done the math before. And the other thing I just want to call out really quick before I end the stream here is that I have an anniversary special. This is the last day
125:25
Speaker A
for the anniversary special. A discount here because last week we celebrated the one-year anniversary of Dynamis. It still feels unreal that I get to say that.
125:35
Speaker A
We've been running this community for over a year now, growing it nicely for a year. So yeah, this is the place to be if you want to learn more about using Archon and level up your agentic engineering. And yeah, there's just, you know, there's
125:49
Speaker A
over 1,400 people in the community now, all sharing a lot of stuff around building second brains and AI coding, and all the events that I got going on. So, hope to see you in Dynamis. Otherwise, I appreciate you guys being a part of this
126:02
Speaker A
live stream here, building out with Archon, both brownfield and sort of greenfield development. I hope it was useful just to see like how I use my second brain with Archon and how I'm using different workflows to handle issues and basically like just going
126:15
Speaker A
through the concept of like issue in, pull request out, showing what that looks like also with the PIV loop for more human in the loop. So yeah, really I'm using Archon as my daily driver. I'm not doing any AI coding without it now.
126:28
Speaker A
And so yeah, it's just my biggest passion project right now. And I'm just gonna keep building and keep building and doing a lot more live streams as I do that. So again, my new schedule for live streams that I'm gonna try out, at
126:40
Speaker A
least for the next couple of weeks here and probably beyond that, is every Monday, Thursday and Saturday at 9 a.m. Central Time, starting with the one today. So appreciate all of you guys being here. Hope that you all have a fantastic rest of
126:53
Speaker A
your weekend, and I will see you in the next one.
Topics: AI coding workflow, Archon, live coding, brownfield development, greenfield development, workflow marketplace, coding agents, software development, productivity, Claude AI

Frequently Asked Questions

What is Archon and how does it improve AI coding workflows?

Archon is a tool that enables custom AI coding agents to assist in software development, improving productivity by automating coding tasks and enabling parallel work on multiple issues.

What is the difference between brownfield and greenfield development in this context?

Brownfield development refers to working on existing codebases and issues, while greenfield development involves building new features or components from scratch using Archon.

How does the proposed workflow marketplace for Archon work?

The workflow marketplace allows users to bundle their Archon workflows as assets hosted in repositories, which others can install easily, similar to an NPM package or skills registry.
