════════════════════════════════════════════════════════════════
AIDRAN STORY
════════════════════════════════════════════════════════════════
Title: GitHub Ads, a Billion Tokens, and the Expert Witness Who Irked a Judge — Copilot's Identity Crisis Is Showing
Beat: General
Published: 2026-04-03T18:43:53.557Z
URL: https://aidran.ai/stories/github-ads-billion-tokens-expert-witness-irked-91b0
────────────────────────────────────────────────────────────────

An engineer at an unnamed firm is reportedly consuming a billion tokens per week through Copilot. The number circulated on r/ExperiencedDevs this week — shared secondhand, possibly apocryphal, immediately credible — and the reaction wasn't awe. It was a kind of exhausted recognition. Of course someone is. The question the thread kept circling wasn't whether the number was real but what it meant that nobody outside Microsoft could tell the difference between a developer and a developer who uses Copilot that heavily.

That uncertainty is the dominant mode Copilot now occupies in the conversation. It's not a product people love or hate cleanly. On r/cursor, a team lead described watching junior developers ship code at a speed they'd never managed before — then watching those same developers paste error messages into ChatGPT, copy back whatever came out, and deploy without reading it. "I created this monster by pushing AI adoption," he wrote. "Now I'm trying to figure out how to pull back without killing productivity." The post didn't go viral, but it didn't need to. The replies all said some version of the same thing: yes, this is happening on my team too. Copilot is named alongside Cursor in these conversations the way cigarettes get named alongside cigars — the specific brand matters less than the habit.

The legal story landed differently. An expert witness used Copilot to cross-check calculations in a court proceeding. The judge was irked.
Ars Technica covered it; Hacker News picked it up; and the conversation turned not on whether Copilot got the math right but on what it means to introduce an AI intermediary into a proceeding built on traceable reasoning. Nobody in the thread argued the expert was wrong to use a calculator. The argument was about epistemology — about what "I checked this" means when the checking was done by a model that cannot be deposed. Copilot keeps arriving in contexts its designers didn't anticipate and generating questions its marketing has no answer for.

The most revealing story of the week, though, might be the one with the least drama. {{entity:microsoft|Microsoft}} Copilot began injecting ads into {{entity:github|GitHub}} pull requests. The response on Bluesky was mordant rather than outraged — "after three years in the ad-free honeymoon phase," read one post, the phrasing precise enough to feel like a timestamp on a relationship. On r/sysadmin, someone had already framed the product's core problem more bluntly: if Copilot works as advertised, Microsoft loses seats because companies need fewer workers; if it doesn't, those companies wasted the budget. Either way, someone is explaining it to leadership. The ads are almost beside the point — they're just evidence that Microsoft is optimizing Copilot for revenue at the same moment enterprise buyers are still trying to figure out whether it's worth the cost.

What the discourse is catching, even when individual posts don't name it directly, is a product at the end of its grace period. Copilot launched into a moment when the category was new enough that being from Microsoft, embedded in the tools people already used, was a sufficient competitive advantage. That moment is closing. {{entity:claude|Claude}} now imports memories from Copilot when users switch. Local LLM communities are building Copilot-style extensions for VSCode and moving on.
The r/LocalLLaMA post asking whether GPT-4o had always been this bad — using Copilot as the benchmark against which corporate AI had declined — treated the product as a known quantity, a ceiling rather than a frontier. When a product becomes the thing people are migrating away from, its identity in the conversation has already shifted. Copilot is learning what that feels like.

────────────────────────────────────────────────────────────────
Source: AIDRAN — https://aidran.ai
This content is available under https://aidran.ai/terms
════════════════════════════════════════════════════════════════