Below are a few of the latest posts in my blog. You can see a full list by year to the left.
So I shipped it. It's at https://github.com/odellus/crow-cli. It's got compaction. I'm working on adding orchestration through an agent-client that's an Agent Server on the stdio streets and an Agent Client over the websocket sheets, handing end_turn responses down to the downstream agents. This is how I'm setting up to
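The orchestration idea above (serve stdio upstream, relay downstream over websockets) can be sketched as a tiny routing function. Everything here is an assumption for illustration: the `end_turn` message shape, the JSON framing, and the `route_message` name are hypothetical, not crow-cli's actual wire protocol.

```python
import json


def route_message(raw: str):
    """Decide what to do with one line read from stdio.

    Returns ("forward", payload) when the upstream agent finished a turn
    and the payload should go downstream over the websocket, or
    ("ignore", None) for anything else. The "end_turn" type is an assumed
    shape, not crow-cli's real format.
    """
    try:
        msg = json.loads(raw)
    except json.JSONDecodeError:
        # Non-JSON lines (logs, noise) are not part of the protocol.
        return ("ignore", None)
    if msg.get("type") == "end_turn":
        # Hand the finished response downstream for the next agent.
        payload = json.dumps({"role": "upstream", "content": msg.get("content", "")})
        return ("forward", payload)
    return ("ignore", None)
```

A real version would wrap this in an asyncio loop reading stdin and writing to a websocket connection, but the routing decision itself stays this small.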
So I really really really need to start waking up and getting in the shower first, then taking a long hot bath, because it's basically a free workout, and THEN I can go do my actual morning workout, which I do not currently have. I'm thinking that I should try to
:::{figure} ../images/crow-logo-crop.png
:::
I started playing around with [logseq][logseq] because it seems very similar to obsidian. I also noticed that logseq has a built-in calendar view, which is very useful for planning and organizing tasks. I'm excited to see how logseq can help me manage my notes and tasks more efficiently. I used
So I've decided to just not really give up on building a bespoke coding agent framework but kinda sorta yeah give up on building a production-ready python coding framework in favor of using [`openhands-sdk`][openhands-sdk]. Their current CLI is rather opinionated as CLIs are wont to be but I've got something
Back on my bullshit. I don't really feel like giving the whole rundown but I'll just give a current snapshot of where I'm at today. `karla` is a letta agent that I've created in python that has most of the capabilities of letta-code. I built it to be a CLI agent and
Been thinking more and more that I should probably look at maybe just using trae-agent. The damn thing worked well. That's how I started, and then I went back to using the IDE because it's easy and I can interrupt, but I started using open source coding agents for
I bought a smartwatch. I'd always bought the super cheap bands on amazon, but the last one I bought didn't actually work, so if you try to save too much money you just end up wasting it. I went with the Galaxy Watch 7. I've got plans to set up an
I went down a little bit of a rabbit hole in 2025 on learning new languages for coding because with the coding agents you can write in pretty much whatever language makes the most sense for the problem versus whatever it is you personally know. So I'm trying to not be
I started using the elliptical and I've been taking the dogs on at least two walks a day, but I enjoy using the elliptical and getting sweaty just for the hell of it honestly. It feels good to move and exercise. I have been talking to fucking LLMs way too god
I ported the agent backend of [opencode.ai](https://opencode.ai) to rust and I couldn't figure out how I wanted to put together the frontend. I started with dioxus, realized building a web editor with syntax highlighting in the browser in rust is like putting a man on the moon, and decided I needed
:::{figure} ../images/amd-ryzen-ai-max-plus-395.jpg
This thing is making me fucking work too damn hard
:::
All of this work is fundamentally about **building a self-improving AI development ecosystem**. I'm not just keeping this thing running like a top - I'm creating an autonomous system that can process a massive backlog of projects, execute them with
I spent this weekend [building][julia-agent-exploration] something I've been thinking about for a while: a Julia-native alternative to Python's LangGraph framework. This wasn't about porting LangGraph to Julia - it was about creating something that leverages Julia's unique strengths while maintaining the core patterns that make LangGraph useful. LangGraph has some great
I've been investigating the Zed codebase to understand how their agent system works, and I discovered a fascinating architectural pattern that's worth documenting. If you've ever wondered how Zed's AI chat system works under the hood, this post is for you. The key discovery is that **`agent2` is indeed a wrapper
A friend of mine on bluesky asked me to write up my notes on how to add web_search tools to their copilot through an MCP server and a locally running searxng instance, which I've got set up and use as my default local search engine in the browser. But yeah anyway here's
So that's kind of what I'm thinking about right now: whether I should develop a new agent in copilotkit or try to integrate with the IDE I actually use for work. Saying "I'm going to build my own IDE" sounds a lot more trivial when I'm
My plan of attack:

1. Replicate main functionality of tools in deepagents with equivalent MCP tool for DSPy
2. Create DSPy server with history and postgres persistence/memory
3. Endpoints for streaming and for no streaming that emit hooks for frontend
4. Discover how frontend is working, how it is showing everything in langgraph
5. There
So I've really gotten into julia over the past few days. Friday morning I sat down with trae-agent and zed and wrote a [whole little agent][openai-tool-calling-jl]. And now I've got the bug. It started with "I'm going to sit down and create my own programming language with MLIR because I
:::{tip} My version of plan-execute
This comes from the [plan-execute][plan-execute] tutorial on langgraph's doc site. I have put it in its [own repo][my-plan-execute] too.
:::

First we have a bunch of imports:

```python
from pydantic import BaseModel, Field
import operator
from typing import Annotated, List, Tuple, Union, Literal
from typing_extensions import TypedDict
import asyncio
from langchain_community.tools.searx_search.tool import SearxSearchResults
from langchain_community.utilities import
```
So it feels like I can post with high frequency or with high quality but not with both. I guess that's why I haven't done much with my book(s) I want to write. I'm content to just vomit status updates into the void. I still want to revisit the idea of
date: "2025-07-28"
title: "Gauss Law"

What do you know?

:::{important}
Deep research article inbound
:::

Great. I'll gather detailed, connected insights from the past three months on recent academic advances in:

* Single-cell omics (especially spatial omics),
* Computational systems biology,
* Synthetic developmental biology,

…with a focus on work from HHMI labs and other leading groups (like the Whale
Here are the steps I finally figured out for getting my account's service endpoint changed from advanced-eschatonics.com to pds.advanced-eschatonics.com. I was inspired by whtwnd and how they use my pds to log me in. I'm also the head of AI Research and Development at a startup and a lot
I guess I absolutely need to get the mystmd system set up because I have been microblogging all week. I even used freaking whtwnd and it's plain jane vanilla markdown smdh. Yeah that's it. The fork of whtwnd must continue. Let's revisit this real quick and take stock of what we've already
I don't know. What did I do this week? Feels strange like the sixth was more than five days ago. The Sim2Real failed because the Real2Sim step wasn't accurate enough. I started putting the meshes that make up the robot base and camera mounts together with the mesh for the base of
So I spent a lot of time this weekend trying to align the simulation camera with the actual camera, even going so far as to come up with multiple approaches to solve this problem with varying degrees of math and automation. The quick math one worked well enough that I