72 contributors. That's how many people shipped OpenClaw v2026.2.17 on February 17, and the release notes read like a wishlist that actually got built.
The headline feature is 1M token context windows for Anthropic models. But honestly, the Slack streaming change might be the one you'll notice most day-to-day.
OpenClaw 1 Million Token Context Window
You can now pass params.context1m: true and OpenClaw sends the anthropic-beta: context-1m-2025-08-07 header with your requests. This works with Claude Opus 4.6 and the brand-new Sonnet 4.6. It also fixes GitHub issue #11292, where the beta header was silently dropped: some users assumed the feature itself was broken when the request simply wasn't carrying the header.
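To make the mechanics concrete, here's a minimal sketch of how a client might attach that beta header. The buildHeaders function and the shape of the params object are my assumptions for illustration, not OpenClaw's actual internals; only the flag name and header value come from the release notes.

```typescript
// Hypothetical sketch: conditionally attaching the 1M-context beta header.
interface ModelParams {
  model: string;
  context1m?: boolean;
}

function buildHeaders(params: ModelParams, apiKey: string): Record<string, string> {
  const headers: Record<string, string> = {
    "x-api-key": apiKey,
    "anthropic-version": "2023-06-01",
    "content-type": "application/json",
  };
  // The essence of the #11292 fix: when the flag is set, the beta header
  // must actually be attached to the outgoing request, not dropped.
  if (params.context1m) {
    headers["anthropic-beta"] = "context-1m-2025-08-07";
  }
  return headers;
}
```

The point is that the flag only ever mattered if this header made it onto the wire, which is exactly what the bug broke.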
One catch: this only works on Tier 4 pay-as-you-go API access. Pro and Max plans are excluded. And anything above 200K tokens gets billed at premium long-context rates, so keep an eye on your token budget.
OpenClaw Sonnet 4.6 Support
Anthropic released Sonnet 4.6 the same day, and OpenClaw shipped support immediately. The numbers are worth paying attention to: 72.5% on OSWorld (up from 61.4%), 79.6% on SWE-bench Verified, and users preferred it over Sonnet 4.5 about 70% of the time. At $3/$15 per million tokens compared to Opus 4.6's $5/$25, it's hard to argue against trying it.
OpenClaw added a forward-compatibility fallback too. If your provider doesn't support Sonnet 4.6 yet, the system gracefully drops to the previous model instead of erroring out. That sounds simple, but according to the maintainers it was non-trivial engineering.
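The fallback behavior can be sketched in a few lines. The function name, the fallback table, and the model identifier strings here are illustrative assumptions; the release notes only describe the behavior, not the implementation.

```typescript
// Hypothetical sketch of the forward-compatibility fallback: if the
// requested model isn't supported by the provider, drop to a known
// predecessor instead of erroring out.
const FALLBACKS: Record<string, string> = {
  "claude-sonnet-4-6": "claude-sonnet-4-5", // assumed mapping
};

function resolveModel(requested: string, supported: Set<string>): string {
  if (supported.has(requested)) return requested;
  const fallback = FALLBACKS[requested];
  if (fallback && supported.has(fallback)) return fallback;
  throw new Error(`No supported model for ${requested}`);
}
```

The tricky part in practice is presumably keeping that mapping correct across providers with different rollout schedules, which is likely where the "non-trivial engineering" came in.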
OpenClaw Slack Streaming
Before this update, Slack responses arrived as a wall of text after the model finished generating. Now OpenClaw uses Slack's native chat.startStream / appendStream / stopStream API for token-by-token streaming. It's enabled by default and falls back gracefully if your Slack workspace doesn't support it yet.
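The streaming loop is easy to picture. This sketch uses a stand-in client interface named after the streaming methods the release notes mention; the argument shapes and the streamReply helper are my assumptions, not the real Slack SDK types.

```typescript
// Illustrative token-by-token streaming loop over a Slack-style
// start/append/stop streaming API. Interface shapes are assumptions.
interface SlackStreamClient {
  startStream(args: { channel: string }): Promise<{ ts: string }>;
  appendStream(args: { channel: string; ts: string; text: string }): Promise<void>;
  stopStream(args: { channel: string; ts: string }): Promise<void>;
}

async function streamReply(
  client: SlackStreamClient,
  channel: string,
  tokens: AsyncIterable<string>,
): Promise<void> {
  const { ts } = await client.startStream({ channel });
  try {
    for await (const token of tokens) {
      // Each model token is appended as it arrives, instead of
      // buffering the full response and posting it at the end.
      await client.appendStream({ channel, ts, text: token });
    }
  } finally {
    // Close the stream even if generation fails mid-way.
    await client.stopStream({ channel, ts });
  }
}
```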
Small change on paper. Big difference when you're waiting for a 2,000-word response.
Security Patch: OC-09
This one matters. OC-09 covered a credential-theft vulnerability via environment variable injection, and the same patch hardened $include handling against path traversal. If you're on ClawHosters, your instance was patched automatically on day one. No action needed.
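For the path-traversal half, the usual hardening looks something like this. This is a generic sketch of the technique, not OC-09's actual patch; the resolveInclude name and config-root layout are assumptions.

```typescript
// Generic $include path-traversal hardening: resolve the include path
// against the config root and refuse anything that escapes it.
import * as path from "path";

function resolveInclude(configRoot: string, includePath: string): string {
  const resolved = path.resolve(configRoot, includePath);
  const root = path.resolve(configRoot) + path.sep;
  // Rejects "../../etc/passwd"-style escapes after resolution.
  if (!resolved.startsWith(root)) {
    throw new Error(`$include escapes config root: ${includePath}`);
  }
  return resolved;
}
```

Checking the resolved path rather than the raw string is the key detail: naive substring checks on ".." are easy to bypass with redundant separators or symlink-free re-normalization.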
iOS Talk Mode and Subagents
Two smaller additions worth mentioning. iOS Talk Mode got three fixes:
Background listening toggle (off by default to save battery)
Barge-in hardening so the bot doesn't interrupt itself through the speaker
Voice directive hint toggle that can save tokens
There's also a new /subagents spawn command for easier multi-agent workflows. Check the docs for setup details.
Full details in the release changelog and Simon Willison's breakdown.