Initially, I didn’t plan to jump on the OpenClaw bandwagon, since I didn’t have a specific use case for it. Then I suddenly got a task from company leadership: deploy OpenClaw internally. The goal is to hire a few free digital employees and get the most out of them. So I had to give it a try. This morning I installed Tencent’s version of OpenClaw, called WorkBuddy. It felt pretty good, but it couldn’t fulfill some of my requirements, such as:

So, I thought I’d deploy a pure-blooded version of OpenClaw myself to see if it could meet my needs. Plus, I discovered that OpenClaw can also integrate with GitHub Copilot, using a free model with unlimited tokens, which made it even more attractive.
Installing OpenClaw
Refer to the official OpenClaw website:
https://openclaw.ai
A single command is all it takes to install. For example, on Windows, open PowerShell and enter:
iwr -useb https://openclaw.ai/install.ps1 | iex
iwr – that’s a command I’m seeing for the first time. It’s short for Invoke-WebRequest, a built-in PowerShell command similar to the curl or wget commands in Linux.
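The one-liner above follows the classic “download a script, then pipe it to an interpreter” pattern (the same thing `curl -fsSL <url> | sh` does on Linux). A minimal local simulation of that pattern, with no network involved:

```shell
# PowerShell:  iwr -useb <url> | iex
#   iwr = Invoke-WebRequest (download)   iex = Invoke-Expression (execute)
# Simulated locally: generate a tiny "installer" script and pipe it to sh,
# exactly as `curl -fsSL <url> | sh` would pipe a downloaded script.
printf 'echo "installer ran"\n' | sh
```

The real installer just replaces the `printf` with an HTTPS download; the execution side is identical.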
Integrating GitHub Copilot
After installation, there’s a configuration guide:
Under Model/auth provider, select Copilot

Then, for Copilot auth method, choose GitHub Copilot (GitHub device login)

This device login works the same way as when installing the GitHub Copilot CLI previously: it generates an authorization code locally, and you enter that code on the GitHub website to complete the authorization. Very convenient. However, be careful not to do this in WSL1 (it has a bug); it works fine in PowerShell. For reference, see my earlier write-up: Unlimited Free Large Model Tokens, GitHub Copilot CLI SDK Installation and Testing.

Furthermore, I suspect that installing the GitHub Copilot CLI is still necessary, because the Copilot CLI SDK also required it. The reason is that the OpenClaw code most likely uses the SDK to call the Copilot CLI’s server mode to utilize Copilot’s capabilities: the SDK handles communication with the CLI, and the CLI handles communication with Copilot’s backend services. The flow is as follows:
OpenClaw
↓
SDK Client
↓ JSON-RPC
Copilot CLI (server mode)
↓
GitHub Copilot Backend
Although it’s just a guess, this is highly probable.
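The guessed flow above can be sketched as a minimal JSON-RPC 2.0 exchange. Everything here is hypothetical: the method name, the params, and the stand-in server are my own inventions for illustration, since I haven’t read the SDK’s actual wire protocol.

```python
import json

def make_request(req_id, method, params):
    """Build a JSON-RPC 2.0 request, as an SDK client might send to the CLI."""
    return json.dumps({"jsonrpc": "2.0", "id": req_id,
                       "method": method, "params": params})

def fake_cli_server(raw):
    """Stand-in for the Copilot CLI in server mode: parse a request, echo a result."""
    req = json.loads(raw)
    return json.dumps({"jsonrpc": "2.0", "id": req["id"],
                       "result": {"text": f"handled {req['method']}"}})

# OpenClaw -> SDK client -> (JSON-RPC) -> Copilot CLI (server mode)
raw = make_request(1, "completion.create", {"prompt": "hello"})
resp = json.loads(fake_cli_server(raw))
print(resp["result"]["text"])  # handled completion.create
```

In the real setup, the fake server would be a long-lived CLI process (likely over stdio or a local socket), with the CLI forwarding requests on to GitHub Copilot’s backend.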
Model Selection: gpt-4o or gpt-5-mini?
Both of these models are free and unlimited within GitHub Copilot. I definitely don’t want to use a paid model in OpenClaw; who knows whether it would burn straight through the 300 free premium requests per month.
I’m not really familiar with the specific differences between these two models. I asked Gemini, and it recommended gpt-5-mini. The reason given was:
The core pain points for tools like OpenClaw are handling dynamic web pages and anti-scraping measures. GPT-5 mini has specific optimizations for visual reasoning.
- GPT-4o: Can sometimes fail to distinguish between overlapping DOM elements or complex CSS layouts.
- GPT-5 mini: Has higher accuracy in recognizing contexts for Canvas rendering, complex iframe nesting, and CAPTCHAs.
Although I can’t verify the truth of this, my experience with gpt-5-mini has been quite good. No major issues.
Choosing the Chat Application
Initially, I configured Feishu (Lark), but I couldn’t get it to work. The issue was that the robot in Feishu didn’t show a chat input box… Maybe I misconfigured something somewhere. Anyway, I didn’t want to dwell on it. So, I switched to WeChat Work (Enterprise WeChat).
The reason for choosing WeChat Work is that this morning I used the Tencent version of OpenClaw, WorkBuddy, which integrated with a WeChat Work robot. The process was much simpler and had no pitfalls. Feishu’s configuration is still too complex and involves many permissions; maybe it’s more powerful and flexible. But I only need simple chat functionality, and WeChat Work is sufficient. For specific configuration, refer to the official WeChat Work documentation for OpenClaw; it’s very detailed:
https://open.work.weixin.qq.com/help2/pc/cat?doc_id=21657
Testing the Results
Haha, after completing the configuration, I could fully experience OpenClaw’s capabilities.

I feel that it’s still quite a hassle. If there are no special requirements, it’s better to just use the Tencent version, WorkBuddy. Moreover, its built-in free model has web search capabilities, saving you the trouble of configuring OpenClaw yourself and worrying about costs. Although WorkBuddy will likely charge a fee sooner or later.
About the Author 🌱
I am a developer from Yantai, Shandong, China. If you have any interesting topics or software development needs, feel free to email me at zhongwei.sun2008@gmail.com for a chat, or follow my personal public account "Elephant Tools" for more ways to get in touch.