chore(repo): initialize planarchy workspace
# Azure OpenAI / Codex CLI Setup

This document collects the steps you need when working with the **Codex CLI** and the
Azure OpenAI service. The examples assume that you have two deployments available in
your Azure OpenAI project:

* `gpt-5.4`
* `gpt-5.4-pro`

Both of these models are currently active in our environment and can be referenced from
the CLI configuration.

---

## 1. Prerequisites

1. **Azure Subscription** – an active subscription with access to the Azure OpenAI
   resource.
2. **Model Deployment** – make sure at least one of the Codex-compatible models has
   been deployed. In our case the deployments are named `gpt-5.4` and
   `gpt-5.4-pro`.
3. **Endpoint & key** – note down the endpoint URL and the API key from the Azure
   portal; you will need them for the configuration file.
4. **Supported OS** – macOS 12+, Ubuntu 20.04+, or Windows 11 via WSL2.
5. **Tools** – Node.js + npm installed on your machine.

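
The tooling check from step 5 can be done in one go. This is only a convenience sketch; `check_tool` is a hypothetical helper defined here, not part of the Codex CLI:

```shell
#!/bin/sh
# check_tool NAME – print NAME's version if it is on PATH, else warn.
check_tool() {
  if command -v "$1" >/dev/null 2>&1; then
    printf '%s: %s\n' "$1" "$("$1" --version 2>/dev/null | head -n 1)"
  else
    printf '%s: NOT FOUND\n' "$1" >&2
    return 1
  fi
}

check_tool node || echo "Install Node.js from https://nodejs.org first."
check_tool npm  || echo "npm normally ships with Node.js."
```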
---

## 2. Install and configure the Codex CLI

```bash
npm install -g @openai/codex
```

Create a configuration folder in your home directory:

```bash
mkdir -p ~/.codex
```

Then create a `config.toml` inside that folder with the following content
(replace the placeholders with your actual values):

```toml
# ~/.codex/config.toml

# default model to use; switch between "gpt-5.4" and "gpt-5.4-pro" as needed
model = "gpt-5.4"
model_provider = "azure"

[model_providers.azure]
name = "Azure OpenAI"
base_url = "https://<your-resource-name>.openai.azure.com/"
env_key = "AZURE_OPENAI_API_KEY"
```

The `model` field can be changed at any time to `gpt-5.4-pro` when you want the
higher‑tier model; the CLI reads this file on each invocation.

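
If you prefer to script the setup, the two steps above can be combined into one snippet. This is just a sketch of the same sample file; `CODEX_HOME` here is a shell variable used by the snippet so it can be pointed at a scratch directory for testing:

```shell
#!/bin/sh
# Write the sample config from this document in one step.
# CODEX_HOME defaults to ~/.codex; override it to test elsewhere.
CODEX_HOME="${CODEX_HOME:-$HOME/.codex}"
mkdir -p "$CODEX_HOME"
cat > "$CODEX_HOME/config.toml" <<'EOF'
# default model to use; switch between "gpt-5.4" and "gpt-5.4-pro" as needed
model = "gpt-5.4"
model_provider = "azure"

[model_providers.azure]
name = "Azure OpenAI"
base_url = "https://<your-resource-name>.openai.azure.com/"
env_key = "AZURE_OPENAI_API_KEY"
EOF
```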
Finally, export your API key into the environment:

```bash
export AZURE_OPENAI_API_KEY="<your-key-here>"
```

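
Note that `export` only lasts for the current shell session. To make the key available in future sessions you can append the line to your shell profile; the profile path below is an assumption (bash) – zsh users would use `~/.zshrc` instead. Remember to replace the placeholder with your real key and keep the dotfile protected accordingly:

```shell
#!/bin/sh
# Append the export line to the shell profile unless it is already there.
# PROFILE is assumed to be ~/.bashrc; adjust for your shell.
PROFILE="${PROFILE:-$HOME/.bashrc}"
touch "$PROFILE"
if ! grep -q 'AZURE_OPENAI_API_KEY' "$PROFILE"; then
  echo 'export AZURE_OPENAI_API_KEY="<your-key-here>"' >> "$PROFILE"
fi
```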
On Windows (PowerShell) use:

```powershell
setx AZURE_OPENAI_API_KEY "<your-key-here>"
```

or set it in the System environment variables via the Control Panel. Note that
`setx` only takes effect in newly opened shells; to set the variable for the
current PowerShell session, use `$Env:AZURE_OPENAI_API_KEY = "<your-key-here>"`.

---

With that in place you can run `codex --help` and start using the CLI against your
Azure deployment.

*Note:* the CLI will automatically pick the model specified in `config.toml`.

To switch models you may either edit the file or pass `--model gpt-5.4-pro` on
the command line.

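
Editing the file can itself be scripted; the sketch below flips the `model` line in place with `sed`. The scratch-copy fallback is only there so the snippet is safe to run before a real config exists:

```shell
#!/bin/sh
# Rewrite the `model = ...` line in config.toml to select the pro deployment.
CONFIG="${CONFIG:-$HOME/.codex/config.toml}"
# Fall back to a scratch copy so the demo works without a real config.
if [ ! -f "$CONFIG" ]; then
  CONFIG=$(mktemp)
  echo 'model = "gpt-5.4"' > "$CONFIG"
fi
# -i.bak works with both GNU and BSD sed and keeps a backup of the old file.
sed -i.bak 's/^model = .*/model = "gpt-5.4-pro"/' "$CONFIG"
grep '^model = ' "$CONFIG"
```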
---
This repository does not actually contain the `.codex` folder – it lives in your
home directory – but the sample file above is provided for reference. You can
copy it into your own environment when you set up the CLI.