OpenAI TypeScript and JavaScript API Library
This library provides convenient access to the OpenAI REST API from TypeScript or
JavaScript.
To learn how to use the OpenAI API, check out our API Reference and Documentation.
Installation
npm install openai
Installation from JSR
deno add jsr:@openai/openai
npx jsr add @openai/openai
These commands will make the module importable from the @openai/openai scope:
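For example, a one-line import using the library's default export:

import OpenAI from '@openai/openai';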
You can also import directly from JSR without an install step if you're using the
Deno JavaScript runtime:
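For example, using a jsr: specifier directly:

import OpenAI from 'jsr:@openai/openai';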
Streaming responses
We provide support for streaming responses using Server Sent Events (SSE).
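For example, a minimal sketch using chat completions with stream: true (the model and prompt are illustrative):

import OpenAI from 'openai';

const client = new OpenAI();

async function main() {
  const stream = await client.chat.completions.create({
    model: 'gpt-4o',
    messages: [{ role: 'user', content: 'Say this is a test' }],
    stream: true,
  });
  for await (const chunk of stream) {
    process.stdout.write(chunk.choices[0]?.delta?.content || '');
  }
}

main();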
If you need to cancel a stream, you can break from the loop or call
stream.controller.abort().
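The library also provides streaming helpers on top of the raw SSE stream; a sketch, assuming the beta chat completions streaming helper (the model and prompt are illustrative):

import OpenAI from 'openai';

const client = new OpenAI();

async function main() {
  const stream = client.beta.chat.completions.stream({
    model: 'gpt-4o',
    messages: [{ role: 'user', content: 'Say this is a test' }],
    stream: true,
  });

  // React to content deltas as they arrive...
  stream.on('content', (delta) => {
    process.stdout.write(delta);
  });

  // or, equivalently, consume the stream as an async iterable:
  for await (const chunk of stream) {
    process.stdout.write(chunk.choices[0]?.delta?.content || '');
  }

  // Wait for the accumulated, final chat completion.
  const chatCompletion = await stream.finalChatCompletion();
  console.log(chatCompletion);
}

main();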
See helpers.md for more details.
Documentation for each method, request param, and response field is available in
docstrings and will appear on hover in most modern editors.
File uploads
Request parameters that correspond to file uploads can be passed in many different
forms:
File (or an object with the same structure)
a fetch Response (or an object with the same structure)
an fs.ReadStream
the return value of our toFile helper
import fs from 'fs';
import fetch from 'node-fetch';
import OpenAI, { toFile } from 'openai';
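const client = new OpenAI();

// If you have access to Node `fs`, you can stream a file from disk
// (a sketch; 'input.jsonl' is an illustrative path):
await client.files.create({ file: fs.createReadStream('input.jsonl'), purpose: 'fine-tune' });

// You can also pass a web `fetch` Response (a sketch; the URL is illustrative):
await client.files.create({ file: await fetch('https://example.com/input.jsonl'), purpose: 'fine-tune' });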
// Or if you have the web `File` API you can pass a `File` instance:
await client.files.create({ file: new File(['my bytes'], 'input.jsonl'), purpose: 'fine-tune' });
// Finally, if none of the above are convenient, you can use our `toFile` helper:
await client.files.create({
file: await toFile(Buffer.from('my bytes'), 'input.jsonl'),
purpose: 'fine-tune',
});
await client.files.create({
file: await toFile(new Uint8Array([0, 1, 2]), 'input.jsonl'),
purpose: 'fine-tune',
});
Handling errors
When the library is unable to connect to the API, or if the API returns a non-
success status code (i.e., 4xx or 5xx response), a subclass of APIError will be
thrown:
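A sketch of catching and inspecting such an error (the fine-tuning request and file ID are illustrative):

import OpenAI from 'openai';

const client = new OpenAI();

async function main() {
  const job = await client.fineTuning.jobs
    .create({ model: 'gpt-4o', training_file: 'file-abc123' })
    .catch(async (err) => {
      if (err instanceof OpenAI.APIError) {
        console.log(err.request_id);
        console.log(err.status); // e.g. 400
        console.log(err.name); // e.g. BadRequestError
        console.log(err.headers); // e.g. { server: 'nginx', ... }
      } else {
        throw err;
      }
    });
}

main();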
Non-success status codes map to dedicated APIError subclasses: for example, 400 raises BadRequestError, 401 AuthenticationError, 403 PermissionDeniedError, 404 NotFoundError, 422 UnprocessableEntityError, 429 RateLimitError, and 5xx responses raise InternalServerError, while failures to reach the API at all raise APIConnectionError.
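Timeouts
Requests time out after 10 minutes by default. A minimal sketch of configuring this with the timeout client option (the 20-second value is illustrative):

// Configure the default for all requests:
const client = new OpenAI({
  timeout: 20 * 1000, // 20 seconds (default is 10 minutes)
});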
// Override per-request:
await client.chat.completions.create({ messages: [{ role: 'user', content: 'How can I list all files in a directory using Python?' }], model: 'gpt-4o' }, {
timeout: 5 * 1000,
});
On timeout, an APIConnectionTimeoutError is thrown.
Note that requests which time out will be retried twice by default.
Request IDs
For more information on debugging requests, see these docs
All object responses in the SDK provide a _request_id property which is added from
the x-request-id response header so that you can quickly log failing requests and
report them back to OpenAI.
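For example (a sketch; the prompt is illustrative):

const completion = await client.chat.completions.create({
  messages: [{ role: 'user', content: 'Say this is a test' }],
  model: 'gpt-4o',
});
console.log(completion._request_id); // e.g. req_123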
Microsoft Azure OpenAI
To use this library with Azure OpenAI, use the AzureOpenAI class instead of the OpenAI class.
[!IMPORTANT] The Azure API shape slightly differs from the core API shape, which means that the static types for responses / params won't always be correct.
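A sketch using Microsoft Entra ID authentication via @azure/identity (the scope, API version, and deployment name are illustrative, and the Azure endpoint is assumed to be supplied via configuration or environment):

import { AzureOpenAI } from 'openai';
import { getBearerTokenProvider, DefaultAzureCredential } from '@azure/identity';

const credential = new DefaultAzureCredential();
const scope = 'https://cognitiveservices.azure.com/.default';
const azureADTokenProvider = getBearerTokenProvider(credential, scope);

const client = new AzureOpenAI({ azureADTokenProvider, apiVersion: '2024-06-01' });

const result = await client.chat.completions.create({
  model: 'gpt-4o', // the name of your model deployment
  messages: [{ role: 'user', content: 'Say hello!' }],
});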
Advanced Usage
Accessing raw Response data (e.g., headers)
The "raw" Response returned by fetch() can be accessed through the .asResponse()
method on the APIPromise type that all methods return.
You can also use the .withResponse() method to get the raw Response along with the
parsed data.
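A sketch of both approaches (the request params are illustrative):

const client = new OpenAI();

const httpResponse = await client.chat.completions
  .create({ messages: [{ role: 'user', content: 'Say this is a test' }], model: 'gpt-4o' })
  .asResponse();
console.log(httpResponse.headers.get('x-request-id'));
console.log(httpResponse.statusText);

const { data: chatCompletion, response: raw } = await client.chat.completions
  .create({ messages: [{ role: 'user', content: 'Say this is a test' }], model: 'gpt-4o' })
  .withResponse();
console.log(raw.headers.get('x-request-id'));
console.log(chatCompletion.choices[0]?.message.content);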
Undocumented endpoints
To make requests to undocumented endpoints, you can use client.get, client.post,
and other HTTP verbs. Options on the client, such as retries, will be respected
when making these requests.
await client.post('/some/path', {
body: { some_prop: 'foo' },
query: { some_query_arg: 'bar' },
});
Undocumented request params
To make requests using undocumented parameters, you may use // @ts-expect-error on
the undocumented parameter. This library doesn't validate at runtime that the
request matches the type, so any extra values you send will be sent as-is.
client.foo.create({
foo: 'my_param',
bar: 12,
// @ts-expect-error baz is not yet public
baz: 'undocumented option',
});
For requests with the GET verb, any extra params will be sent in the query string; all other requests will send the extra params in the body.
If you want to explicitly send an extra argument, you can do so with the query,
body, and headers request options.
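For example, a sketch reusing the hypothetical client.foo.create call from above and passing extra values through the request options argument:

client.foo.create(
  { foo: 'my_param', bar: 12 },
  {
    query: { baz: 'undocumented query param' },
    headers: { 'X-Custom-Header': 'custom value' },
  },
);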
Customizing the fetch client
By default, this library uses node-fetch in Node and expects a global fetch function in other environments. If you prefer to use the global, web-standards fetch function even in a Node environment, add the following import before your first import from "openai":

// Tell TypeScript and the package to use the global web fetch instead of node-fetch.
// Note, despite the name, this does not add any polyfills, but expects them to be provided if needed.
import 'openai/shims/web';
import OpenAI from 'openai';
To do the inverse, add import "openai/shims/node" (which does import polyfills).
This can also be useful if you are getting the wrong TypeScript types for Response
(more details).
Configuring an HTTP(S) Agent (e.g., for proxies)
By default, this library uses a keep-alive agent for all http/https requests so that TCP connections are reused across requests.
If you would like to disable or customize this behavior, for example to use the API behind a proxy, you can pass an httpAgent which is used for all requests (be they http or https), for example:
import http from 'http';
import { HttpsProxyAgent } from 'https-proxy-agent';
import OpenAI from 'openai';
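// Configure the default for all requests (a sketch; the proxy URL is illustrative):
const client = new OpenAI({
  httpAgent: new HttpsProxyAgent('http://proxy.example.com:8080'),
});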
// Override per-request:
await client.models.list({
httpAgent: new http.Agent({ keepAlive: false }),
});
Semantic versioning
This package generally follows SemVer conventions, though certain backwards-
incompatible changes may be released as minor versions:
Changes that only affect static types, without breaking runtime behavior.
Changes to library internals which are technically public but not intended or
documented for external use. (Please open a GitHub issue to let us know if you are
relying on such internals.)
Changes that we do not expect to impact the vast majority of users in practice.
We take backwards-compatibility seriously and work hard to ensure you can rely on a
smooth upgrade experience.
We are keen for your feedback; please open an issue with questions, bugs, or
suggestions.
Requirements
TypeScript >= 4.5 is supported.
Supported runtimes include:
Cloudflare Workers.
Jest 28 or greater with the "node" environment ("jsdom" is not supported at this
time).
Web browsers: disabled by default to avoid exposing your secret API credentials.
Enable browser support by explicitly setting dangerouslyAllowBrowser to true.
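For example, a sketch of opting in (shipping a real secret key to the browser is exactly the risk described below):

const client = new OpenAI({
  apiKey: 'My API Key',
  dangerouslyAllowBrowser: true,
});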
More explanation
Why is this dangerous?
Enabling the dangerouslyAllowBrowser option can be dangerous because it exposes
your secret API credentials in the client-side code. Web browsers are inherently
less secure than server environments; any user with access to the browser can
potentially inspect, extract, and misuse these credentials. This could lead to
unauthorized access using your credentials and potentially compromise sensitive
data or functionality.
When might this not be dangerous?
There are certain scenarios where enabling browser support might not pose significant risks.
If you are interested in other runtime environments, please open or upvote an issue
on GitHub.