
The Prompt API

12 points · 2 hours ago · developer.chrome.com
avaer · 40 minutes ago

It works; I've shipped this as "local inference"/a poor person's Ollama for low-end LLM tasks like search. The main win is that it's free and privacy-preserving, and (mostly) transparent to users in that they don't have to do anything, which is great for giving non-technical users local inference without making them do scary native things.

But keep in mind the actual experience for users is not great: the model download is orders of magnitude larger than the browser itself, and it has to finish before you get your first token back. That's unfixable until operating systems start reliably shipping their own prebaked models that an API like this could plug into.
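For readers who haven't tried it, the flow looks roughly like this: feature-detect, check whether the model is already on disk, then create a session (which kicks off the download if needed). A minimal sketch based on the shape described on developer.chrome.com; the global `LanguageModel` name, the availability states, and the `downloadprogress` monitor are from the origin-trial docs and may change:

```javascript
// Hedged sketch of the Prompt API flow (origin-trial API; names may change).
async function askLocalModel(question) {
  if (!('LanguageModel' in globalThis)) {
    throw new Error('Prompt API not supported in this browser');
  }
  // Availability is roughly: 'unavailable' | 'downloadable' | 'downloading' | 'available'
  const availability = await LanguageModel.availability();
  if (availability === 'unavailable') {
    throw new Error('Gemini Nano cannot run on this device');
  }
  // create() triggers the multi-gigabyte model download if it isn't local yet,
  // so surface progress instead of leaving the user staring at a spinner.
  const session = await LanguageModel.create({
    monitor(m) {
      m.addEventListener('downloadprogress', (e) => {
        console.log(`model download: ${Math.round(e.loaded * 100)}%`);
      });
    },
  });
  return session.prompt(question);
}
```

This is exactly why the first-token latency avaer describes is unavoidable: `create()` can't resolve until the download completes.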

skybrian · 43 minutes ago

Still in origin trial? Looks like they're adding a temperature parameter:

https://chromestatus.com/feature/6325545693478912
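Per the explainer linked from that chromestatus entry, sampling options would be passed at session creation, clamped against device limits reported by `params()`. A hedged sketch; the `params()` fields and the convention of passing `temperature` and `topK` together come from the explainer and may differ in what ships:

```javascript
// Hedged sketch: creating a session with a custom temperature, assuming the
// params()/create() shape from the Prompt API explainer (subject to change).
async function createTunedSession(temperature) {
  const { defaultTopK, maxTemperature } = await LanguageModel.params();
  return LanguageModel.create({
    // Clamp to the device-reported maximum; the explainer suggests
    // temperature and topK are specified together.
    temperature: Math.min(temperature, maxTemperature),
    topK: defaultTopK,
  });
}
```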

benatkin · 38 minutes ago

I like how all five examples are stuff I don't want. The only one where I see some utility is asking questions about a page, but I would want to do that with a frontier model, not Gemini Nano. Way to go, Chrome.

fg137 · 44 minutes ago

"sorry, to use our website, you must have at least 22 GB of free disk space."
