Available Cloud LLMs

Find reference information and an up-to-date list (as of February 25th, 2025) of cloud LLMs available for use that are currently supported by PiecesOS, the Pieces Desktop App, and other Pieces plugins and extensions.


Supported LLMs

The Pieces for Developers Suite currently supports cloud models from a range of providers.


| Provider | Model Name |
| --- | --- |
| OpenAI | GPT-X |
| Anthropic | Claude / Sonnet / Opus / Haiku |
| Google | Gemini / Pro / Flash / Chat |

View the tables below for detailed model names, parameters, and context window sizes for all usable models.


Please note that not all models have easily identifiable parameter counts. Some companies publish this information about their models, while others do not. As such, the parameter figures in these tables are estimated ranges based on leading AI sources, detailed evaluations and assessments, and other available information.


OpenAI


| Model Name | Parameters | Context Window (Maximum) |
| --- | --- | --- |
| GPT-4o Mini | 8b | 128k tokens |
| GPT-4o | N/A | 128k tokens |
| GPT-4 Turbo | N/A | 128k tokens |
| GPT-4 | N/A | 8k tokens |
| GPT-3.5 | N/A | 4k tokens |

Anthropic


| Model Name | Parameters | Context Window (Maximum) |
| --- | --- | --- |
| Claude 3.5 Sonnet | 175b | 40k tokens |
| Claude 3 Sonnet | 100b | 40k tokens |
| Claude 3 Opus | 150b | 40k tokens |
| Claude 3 Haiku | N/A | 40k tokens |

Google


| Model Name | Parameters | Context Window (Maximum) |
| --- | --- | --- |
| Gemini Pro Chat | 8b | 4k tokens |
| Gemini 2 Flash | 30b | 1m tokens |
| Gemini 1.5 Pro | 45b | 128k tokens |
| Gemini 1.5 Flash | 80b | 256k tokens |
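
The context window figures above bound how much combined prompt, pasted context, and conversation history a model can accept. As a rough illustration only (this is not the Pieces API), the sketch below encodes the context window values from these tables and uses an assumed ~4 characters-per-token heuristic to check whether a prompt is likely to fit a given model's window.

```python
# Sketch: checking prompt size against the context windows listed above.
# Assumption: ~4 characters per token is a rough heuristic, not an exact tokenizer;
# use the provider's own tokenizer for precise counts.

CONTEXT_WINDOWS_TOKENS = {
    # OpenAI
    "GPT-4o Mini": 128_000,
    "GPT-4o": 128_000,
    "GPT-4 Turbo": 128_000,
    "GPT-4": 8_000,
    "GPT-3.5": 4_000,
    # Anthropic
    "Claude 3.5 Sonnet": 40_000,
    "Claude 3 Sonnet": 40_000,
    "Claude 3 Opus": 40_000,
    "Claude 3 Haiku": 40_000,
    # Google
    "Gemini Pro Chat": 4_000,
    "Gemini 2 Flash": 1_000_000,
    "Gemini 1.5 Pro": 128_000,
    "Gemini 1.5 Flash": 256_000,
}


def estimate_tokens(text: str) -> int:
    """Approximate token count using ~4 characters per token (heuristic)."""
    return max(1, len(text) // 4)


def fits_in_context(model_name: str, prompt: str, reserved_for_reply: int = 1_000) -> bool:
    """Return True if the estimated prompt size leaves room for a reply."""
    window = CONTEXT_WINDOWS_TOKENS[model_name]
    return estimate_tokens(prompt) + reserved_for_reply <= window


if __name__ == "__main__":
    prompt = "Summarize this repository's README. " * 500
    for model in ("GPT-3.5", "GPT-4o", "Gemini 2 Flash"):
        print(model, fits_in_context(model, prompt))
```

Because the character-to-token ratio varies by language and content, treat the result as a quick sanity check when choosing between a smaller window (e.g., 4k tokens) and a larger one (e.g., 128k or 1m tokens), not as a guarantee.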
