54 results for
  • I run a sentiment analysis model on every support ticket that comes in. At first I used the OpenAI API — about 2 cents per ticket. Sounds cheap until you do the math: 10,000 tickets a day, $200/day, $6,000/month. For a model that classifies text into “positive,” “negative,” and “neutral.”

    Switched to a local ONNX model running on a $50/month VM. Same accuracy. Latency dropped from 300ms to 8ms. Cost dropped to roughly zero. Not every task needs GPT-4 — and Rust is arguably the best language for running models locally because you get C++ performance without the C++ pain.
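The cost math in that first paragraph, as a tiny sketch (the figures are the post's own; the function name is illustrative):

```rust
// Monthly cost of per-ticket API pricing, using the post's own figures:
// 10,000 tickets/day at 2 cents each, over a 30-day month.
fn api_cost_per_month(tickets_per_day: u64, cents_per_ticket: u64, days: u64) -> f64 {
    (tickets_per_day * cents_per_ticket * days) as f64 / 100.0
}
```

With those numbers this comes to $6,000/month, against a flat $50/month for the VM.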

    Rust tutorial rust ai llm Created Tue, 26 Aug 2025 07:49:00 +0000
  • I have a strongly held opinion about timeouts: if you’re making a network call, a database query, or waiting on any external resource without a timeout, you’ve written a production bug. It just hasn’t fired yet. The network will eventually hang. The database will eventually have a slow query. The external API will eventually stop responding. And when it does, your goroutine will wait. And wait. And wait — holding a connection, a file descriptor, a slot in your worker pool — until the process runs out of resources or someone restarts it.

    Go tutorial golang concurrency Created Tue, 26 Aug 2025 00:00:00 +0000
  • In Java or C#, you declare that a class implements an interface. You write implements Runnable, and the compiler ties that class to that interface forever. Go doesn’t work that way. A type satisfies an interface the moment it has the right methods — no declaration, no explicit relationship. This sounds like a minor syntactic difference, but it changes how you design systems in ways that compound over time.

    The Problem

    When interfaces are declared by the implementor (the Java way), you end up with a few recurring problems.

    Go tutorial golang Created Mon, 25 Aug 2025 00:00:00 +0000
  • The first time I heard about MCP, I dismissed it as yet another protocol nobody would adopt. Then Claude Desktop shipped with MCP support, then Cursor, then Windsurf, then half the AI tools I use daily. Turns out when Anthropic publishes a spec and immediately supports it in their flagship products, adoption happens fast.

    MCP — Model Context Protocol — is a standardized way for AI models to discover and use tools, access data sources, and interact with external systems. Think of it as USB for AI: a universal interface so models don’t need custom integrations for every data source. And Rust is a fantastic language for building MCP servers because they need to be fast, reliable, and run for a long time without leaking memory.
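For a flavor of the wire format: MCP runs over JSON-RPC 2.0, and tool discovery is a `tools/list` request. A simplified sketch of one exchange, with a hypothetical `search_docs` tool and most response fields omitted:

```json
{"jsonrpc": "2.0", "id": 1, "method": "tools/list"}

{"jsonrpc": "2.0", "id": 1, "result": {"tools": [{
  "name": "search_docs",
  "description": "Search internal documentation",
  "inputSchema": {"type": "object", "properties": {"query": {"type": "string"}}}
}]}}
```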

    Rust tutorial rust ai llm Created Fri, 22 Aug 2025 10:23:00 +0000
  • I built my first “AI agent” by stuffing a system prompt into a while loop and hoping for the best. It worked — sometimes. Other times it’d get stuck in infinite loops, burn through $50 of API credits hallucinating tool calls that didn’t exist, or confidently produce completely wrong answers after three rounds of “reasoning.”

    The problem wasn’t the LLM. The problem was me treating agent design as an afterthought. Good agents need structure — clear state machines, well-defined stopping conditions, and guardrails that prevent runaway behavior. This is where Rust’s type system pays massive dividends, because you can encode these constraints at the type level.
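A minimal sketch of what "encode these constraints at the type level" can look like: the agent's lifecycle as an enum with explicit terminal states and a hard step budget. All names are illustrative, and the LLM call is simulated.

```rust
#[derive(Debug, PartialEq)]
enum AgentState {
    Thinking { steps_used: u32 },
    Done(String),
    Aborted(&'static str),
}

const MAX_STEPS: u32 = 5; // guardrail: hard cap on reasoning rounds

fn step(state: AgentState) -> AgentState {
    match state {
        AgentState::Thinking { steps_used } if steps_used >= MAX_STEPS => {
            AgentState::Aborted("step budget exhausted") // runaway protection
        }
        AgentState::Thinking { steps_used } => {
            // a real agent would call the LLM here; we just simulate progress
            if steps_used == 2 {
                AgentState::Done(format!("answer after {} steps", steps_used + 1))
            } else {
                AgentState::Thinking { steps_used: steps_used + 1 }
            }
        }
        terminal => terminal, // Done and Aborted are absorbing states
    }
}

fn run() -> AgentState {
    let mut state = AgentState::Thinking { steps_used: 0 };
    // the loop can only continue while the state is non-terminal
    while let AgentState::Thinking { .. } = state {
        state = step(state);
    }
    state
}
```

The payoff is that "loop forever" is unrepresentable: every iteration either makes progress or hits a terminal variant the compiler forces you to handle.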

    Rust tutorial rust ai llm Created Wed, 20 Aug 2025 13:08:00 +0000
  • Tree DP is the pattern that catches people by surprise. You’ve been thinking of DP as filling a 1D or 2D table from left to right — a sequential, iterative process. Trees are recursive by nature. The “table” is implicit in the call stack.

    The key insight: tree DP is just post-order traversal where each node computes its answer from its children’s answers. There’s no explicit table. The memoization (if you need it) is keyed by node pointer. The bottom-up order is inherently satisfied because post-order visits children before parents.
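The pattern in miniature, assuming a generic n-ary tree (the struct and the chosen "answer", the best root-to-leaf sum, are illustrative):

```rust
struct Node {
    value: i64,
    children: Vec<Node>,
}

// Post-order tree DP: each node's answer is computed from its
// children's answers; the call stack is the implicit table.
fn best_path_sum(node: &Node) -> i64 {
    // solve every child subtree first...
    let best_child = node
        .children
        .iter()
        .map(best_path_sum)
        .max()
        .unwrap_or(0); // a leaf contributes only its own value
    // ...then combine the children's answers at the parent
    node.value + best_child
}
```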

    fundamentals interviews Created Wed, 20 Aug 2025 00:00:00 +0000
  • A security auditor once asked me to prove that the binary running in production was actually built from the source code we claimed. I confidently ran cargo build --release, compared the hash of the output with the deployed binary, and… they were different. Same source, same compiler, same machine, different binary. That’s when I learned that reproducible builds aren’t automatic — even in Rust.

    Why Reproducible Builds Matter

    A reproducible build means: given the same source code, same dependencies, same compiler, and same configuration, you get a bit-for-bit identical binary every time, on any machine.
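One common source of nondeterminism is absolute paths baked into the binary (debug info, panic messages). The flags below are real, but treat this as a hedged starting point rather than a complete recipe:

```shell
# Remap the build and home directories so the machine's paths
# don't leak into the binary.
export RUSTFLAGS="--remap-path-prefix=$PWD=/build --remap-path-prefix=$HOME=/home"
# --locked: fail instead of silently updating Cargo.lock,
# so everyone builds the exact same dependency graph.
cargo build --release --locked
```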

    Rust tutorial rust cargo build-system Created Mon, 18 Aug 2025 15:25:00 +0000
  • I spent a week building a keyword search system for internal documentation. Regex patterns, stemming, tf-idf scoring — the whole nine yards. Then someone searched “how do I deploy” and got zero results because every doc said “deployment process” instead of “deploy.” That’s when I switched to embeddings.

    Embeddings map text into high-dimensional vectors where semantically similar content lives close together. “Deploy” and “deployment process” end up near each other in vector space even though they share almost no characters. It’s a fundamentally different approach to search, and once you’ve used it, keyword search feels like the dark ages.
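The "close together" measure is usually cosine similarity. A self-contained sketch of that one operation (real systems get the vectors themselves from an embedding model):

```rust
// Cosine similarity: vectors pointing the same way score near 1.0
// regardless of magnitude, which is why "deploy" and "deployment
// process" can match despite sharing almost no characters.
fn cosine_similarity(a: &[f64], b: &[f64]) -> f64 {
    assert_eq!(a.len(), b.len());
    let dot: f64 = a.iter().zip(b).map(|(x, y)| x * y).sum();
    let norm_a = a.iter().map(|x| x * x).sum::<f64>().sqrt();
    let norm_b = b.iter().map(|x| x * x).sum::<f64>().sqrt();
    dot / (norm_a * norm_b)
}
```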

    Rust tutorial rust ai llm Created Mon, 18 Aug 2025 08:55:00 +0000
  • Here’s something that took me embarrassingly long to internalize: LLMs don’t do things. They generate text that describes doing things. The tool calling protocol is just the model saying “hey, I’d like you to call this function with these arguments” — and then your code actually does it.

    This distinction matters because the entire tool calling system is essentially a serialization contract. The model generates JSON conforming to a schema you provided, you execute the function, and you send the result back. Get the schema wrong, and the model hallucinates arguments. Get the execution wrong, and you’ve got a broken agent. Get the result format wrong, and the model can’t make sense of what happened.
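A toy version of that contract with the JSON layer stripped away, just to show the shape: the model proposes a name plus arguments, your code dispatches on the name, and an unknown name is handled as data rather than a crash. The tool names here are made up.

```rust
// What the model "said": a function name and its argument.
struct ToolCall {
    name: String,
    argument: String,
}

// Your code actually does the thing, then returns a result
// (or an error the model can read) to send back.
fn execute(call: &ToolCall) -> Result<String, String> {
    match call.name.as_str() {
        // each arm is a real function the model is allowed to request
        "uppercase" => Ok(call.argument.to_uppercase()),
        "char_count" => Ok(call.argument.chars().count().to_string()),
        // an unknown name means the model hallucinated a tool: report it,
        // don't panic, so the model can correct itself next turn
        other => Err(format!("unknown tool: {other}")),
    }
}
```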

    Rust tutorial rust ai llm Created Sat, 16 Aug 2025 16:42:00 +0000
  • I deployed a Rust service to a minimal Docker container once — Alpine Linux, nothing installed except the binary. It crashed immediately with “not a dynamic executable.” Turns out my binary was dynamically linked against glibc, but Alpine uses musl. I’d never thought about linking before that day. Now it’s one of the first things I configure on any new project.

    What Linking Actually Is

When you run cargo build, the compiler doesn’t produce a binary directly. It produces object files — chunks of machine code for each compilation unit. The linker takes all those object files, plus any libraries you depend on, and stitches them together into a single executable.
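For the Alpine/musl situation from the opening anecdote, the standard fix is to link statically against musl. These are real rustup and cargo commands, but crates with C dependencies may need a musl toolchain as well; the binary name is a placeholder:

```shell
rustup target add x86_64-unknown-linux-musl
cargo build --release --target x86_64-unknown-linux-musl
# the result should be reported as "statically linked"
file target/x86_64-unknown-linux-musl/release/yourapp
```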

    Rust tutorial rust cargo build-system Created Fri, 15 Aug 2025 10:40:00 +0000