I’m trying to get Gemini to analyze a simple receipt image using the Laravel Http facade, but I keep hitting a 400 “Invalid JSON payload” error.
The request works fine for text-only prompts, but adding the image part breaks it. Here is the snippet:
$res = Http::post($url, [
    'contents' => [[
        'parts' => [
            ['text' => 'What is the total?'],
            ['inline_data' => [
                'mime_type' => 'image/jpeg',
                'data' => base64_encode(file_get_contents($path))
            ]]
        ]
    ]]
]);
Google returns: Unknown name "inline_data": Field 'inline_data' could not be found in request messages.
I’ve triple-checked the nesting based on the docs. Is this an array-structure issue, or am I forced to use the /upload/ File API even for small 150 KB images?
Any ideas?
The error is a camelCase/snake_case mismatch. The Gemini REST API uses proto3 JSON mapping, which serialises all field names in camelCase. inline_data and mime_type are the protobuf field names; their JSON wire representation is inlineData and mimeType. Laravel’s Http::post() passes the array directly through json_encode(), so snake_case keys hit the API verbatim and fail validation.
This is a silent trap because some Google documentation examples show snake_case (proto notation), and the error message itself is misleading — it says the field “could not be found” rather than “wrong case”.
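You can see the pass-through behaviour directly — json_encode() never transforms array keys:

```php
// json_encode() sends array keys verbatim; no case conversion happens,
// so snake_case keys reach the API exactly as written in the PHP array.
echo json_encode(
    ['inline_data' => ['mime_type' => 'image/jpeg']],
    JSON_UNESCAPED_SLASHES
);
// {"inline_data":{"mime_type":"image/jpeg"}}
```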
The Fix
$res = Http::withHeaders(['Content-Type' => 'application/json'])
    ->post($url, [
        'contents' => [[
            'parts' => [
                ['text' => 'What is the total?'],
                ['inlineData' => [                // ← camelCase
                    'mimeType' => 'image/jpeg',   // ← camelCase
                    'data' => base64_encode(file_get_contents($path))
                ]]
            ]
        ]]
    ]);
That is the only structural change required. 150 KB is well within the 20 MB inline limit for the generateContent endpoint, so you do not need the File API for this use case.
A few things worth flagging:
1. Memory ceiling on file_get_contents + base64_encode in-process
base64_encode(file_get_contents($path)) loads the full binary into PHP memory and then inflates it by ~33%. For receipt scanning at scale this is fine, but if this ever moves to document ingestion (multi-page PDFs, high-res photos), you will hit memory limits under queued jobs. Establish the habit of streaming via the File API early rather than refactoring later.
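One cheap habit that pays off early is checking filesize() before reading the whole file into memory. A minimal sketch, assuming a hypothetical shouldInline() helper and the ~20 MB request ceiling mentioned above:

```php
// Hypothetical helper: decide whether a file is safe to send as inline
// base64, or whether it should be routed to the File API instead.
// base64 inflates the payload by ~33%, so cap the raw binary at roughly
// 14 MB to stay comfortably under a 20 MB request ceiling.
const INLINE_LIMIT_BYTES = 14 * 1024 * 1024;

function shouldInline(int $sizeBytes): bool
{
    return $sizeBytes <= INLINE_LIMIT_BYTES;
}

// Usage: call filesize($path) first, and only then file_get_contents().
// shouldInline(150 * 1024)        // true  — a 150 KB receipt inlines fine
// shouldInline(30 * 1024 * 1024)  // false — route to the File API
```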
2. The Http facade silently drops Content-Type in some scenarios
When you pass an array to Http::post(), Laravel auto-sets application/json. If you ever switch to Http::asForm() or Http::attach() for multipart uploads, the content type changes and Gemini will return a different, equally confusing error. Be explicit with ->withHeaders(['Content-Type' => 'application/json']) in production code.
3. No retry strategy on transient 5xx or quota 429s
The pattern above has zero fault tolerance. Gemini’s free and even paid tiers return 429s under burst load. Wrap this in a queued job with exponential backoff from day one:
retry(3, fn () => Http::post($url, $payload)->throw(), 500); // ->throw() is required: retry() only re-runs on exceptions, and Http::post() does not throw on 429/5xx by itself
// In a queued job, declare a public $backoff = [5, 30, 120] property instead — $this->release() takes a delay in seconds, not a backoff schedule.
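A sketch of the queued-job approach, in case it is useful. The class name, queue wiring, and the config('services.gemini.url') key are illustrative assumptions, not from the original post:

```php
namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Support\Facades\Http;

class AnalyzeReceipt implements ShouldQueue
{
    use Queueable, InteractsWithQueue;

    public int $tries = 4;
    public array $backoff = [5, 30, 120]; // seconds before attempts 2, 3, 4

    public function __construct(private array $payload) {}

    public function handle(): void
    {
        // ->throw() converts 429s and 5xx responses into exceptions,
        // which fails the attempt and lets the queue worker retry it
        // on the $backoff schedule above.
        Http::post(config('services.gemini.url'), $this->payload)->throw();
    }
}
```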
4. Consider an SDK over raw Http calls
There is no official first-party PHP SDK for Gemini, but the community-maintained google-gemini-php/client package handles camelCase serialisation, retries, and streaming for you. Raw Http calls are appropriate for a tutorial, but it is worth flagging to your readers that maintaining the payload schema by hand against a rapidly-iterating API (Gemini has had three breaking schema changes in twelve months) is a maintenance liability in production code.
5. Forward-looking: multimodality routing
If this receipt-analysis feature grows, you will likely want to route by MIME type — JPEG/PNG via inlineData, PDFs via the File API, HEIC (common on iOS) needing conversion first. Design that abstraction now rather than bolting it on per request.
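The routing described above can be sketched as a single dispatch point. Strategy names here are illustrative placeholders:

```php
// Hypothetical router: map a MIME type to an upload strategy.
// 'inline'   → send as inlineData (small images)
// 'file_api' → upload via the File API first (PDFs, large assets)
// 'convert'  → transcode to JPEG before inlining (HEIC/HEIF from iOS)
function uploadStrategy(string $mime): string
{
    return match ($mime) {
        'image/jpeg', 'image/png', 'image/webp' => 'inline',
        'application/pdf'                       => 'file_api',
        'image/heic', 'image/heif'              => 'convert',
        default => throw new InvalidArgumentException("Unsupported MIME type: $mime"),
    };
}
```

Keeping this in one place means adding a new format later is a one-line change rather than a hunt through request-building code.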

Noted, thanks.