Sending the Stock Update

To submit a stock/price update, send the stock file periodically to the following endpoint:

https://merchants-connector-importer.zalandoapis.com/{Client ID}/{Stock File Name}

HTTP Request

  • PUT method

URL Parameters and Header

Parameter             Type    Example
Client ID             String  d329krpq
Stock File Name       CSV     20181023_update.csv
API Key (x-api-key)   String  lXViWTzFic9sM8Qqe9Ew7JME8xTdBAOMJHdIjK7XkjQ00OWr

Requirements (MUST)

The request

  • Must be an HTTP PUT request.
  • Must contain the Client ID in the URL (replacing {Client ID}). Example: d329krpq.
  • Must contain the stock file name under which the file will be received (replacing {Stock File Name}); this can differ from your local file name. Example: 20181023_update.csv
  • Must include the stock CSV file.
  • Must contain the API Key in the headers (header name: x-api-key). Example: lXViWTzFic9sM8Qqe9Ew7JME8xTdBAOMJHdIjK7XkjQ00OWr.

Stores in multiple countries

If a client operates in multiple countries, each country has its own API Key and Client ID, and a separate stock feed must be submitted per country.
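The per-country separation above can be sketched as follows. All Client IDs, API keys, and file names below are made up for illustration; the real values are issued per country.

```python
# Hypothetical per-country configuration: each country has its own
# Client ID and stock feed file (values below are placeholders).
COUNTRIES = {
    "DE": {"client_id": "d329krpq", "stock_file": "20181023_update_de.csv"},
    "FR": {"client_id": "k771mnop", "stock_file": "20181023_update_fr.csv"},
}

def build_upload_url(client_id, stock_file_name):
    """Build the stock update endpoint URL for one country's feed."""
    return ("https://merchants-connector-importer.zalandoapis.com/"
            f"{client_id}/{stock_file_name}")

for country, cfg in COUNTRIES.items():
    print(country, build_upload_url(cfg["client_id"], cfg["stock_file"]))
```

Each URL would then receive its own PUT request, authenticated with that country's x-api-key header, as shown in the code samples below.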

Code Samples

Shell

curl -v -X PUT https://merchants-connector-importer.zalandoapis.com/d329krpq/20181023_update.csv \
  --upload-file 20181023_update.csv \
  --header "x-api-key: lXViWTzFic9sM8Qqe9Ew7JME8xTdBAOMJHdIjK7XkjQ00OWr"

Powershell

Invoke-WebRequest `
-UseBasicParsing `
-Uri https://merchants-connector-importer.zalandoapis.com/d329krpq/20181023_update.csv  `
-Method PUT `
-InFile 20181023_update.csv `
-Headers @{'x-api-key'='lXViWTzFic9sM8Qqe9Ew7JME8xTdBAOMJHdIjK7XkjQ00OWr'}

Python

import requests

client_id = "d329krpq"
api_key = "lXViWTzFic9sM8Qqe9Ew7JME8xTdBAOMJHdIjK7XkjQ00OWr"

headers = {
    "x-api-key": api_key,
    "content-type": "application/csv",
    "cache-control": "no-cache",
}

filename = "update.csv"

with open(filename, "rb") as f:
    res = requests.put(
        f"https://merchants-connector-importer.zalandoapis.com/{client_id}/{filename}",
        headers=headers,
        data=f.read(),
    )

    if res.status_code == 200:
        print("Success")
    else:
        print("Failed")

Node.js

const axios = require('axios')
const fs = require('fs')

const clientId = "d329krpq"
const apiKey = "lXViWTzFic9sM8Qqe9Ew7JME8xTdBAOMJHdIjK7XkjQ00OWr"
const fileName = "file.csv"
const data = fs.createReadStream(fileName)
const url = `https://merchants-connector-importer.zalandoapis.com/${clientId}/${fileName}`

const config = {
    headers: {
        "content-type": "application/csv",
        "x-api-key": apiKey,
        "cache-control": "no-cache"
    }
}

axios.put(url, data, config)
    .then((response) => {
        // handle success
        console.log(response);
    })
    .catch((error) => {
        // handle error
        console.log(error);
    })

Response

Response status of 200 (OK)

If the file upload is successful, the response status will be 200 (OK). A backend process will then pick up the task shortly and perform the updates, provided the file format and encoding are correct.

HTTP code 429. You have exceeded your capacity. Please try again in X seconds

Capacity here is defined by the amount of data sent to our system via CSV files. The more stores a partner has integrated with our system, the more capacity that partner is granted. This mechanism protects our system against excessive data and gives all partners a fair opportunity.

You may get this error when submitting a CSV file to the stock update endpoint. It means you have uploaded too much data in a short amount of time; the update is rejected, and you should wait X seconds before trying again.

You can also read the delay X programmatically from the Retry-After header in the response and set up an automatic retry mechanism on your end.
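A minimal retry sketch based on the Retry-After header, extending the Python sample above. The helper names and the 60-second fallback are assumptions, not part of the API.

```python
import time

import requests

def parse_retry_after(headers, default=60):
    """Read the Retry-After delay in seconds; fall back to a default
    if the header is missing or not a plain integer (an assumption,
    not documented behavior)."""
    try:
        return int(headers.get("Retry-After", default))
    except (TypeError, ValueError):
        return default

def put_with_retry(url, file_path, api_key, max_attempts=5):
    """Upload a stock CSV, waiting out HTTP 429 responses before retrying."""
    for _ in range(max_attempts):
        with open(file_path, "rb") as f:
            res = requests.put(url, headers={"x-api-key": api_key}, data=f.read())
        if res.status_code != 429:
            return res
        time.sleep(parse_retry_after(res.headers))
    return res
```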

Other tips to avoid this error:

  • Avoid sending the same CSV file multiple times.

  • Try spacing out uploads. For example, if you need to send 10 files, send 5 first and the remaining 5 after 15 minutes.
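The batching tip above can be sketched as a small helper. The `upload` callable stands in for the actual PUT request (see the code samples earlier); the batch size and pause are illustrative, not API limits.

```python
import time

def chunk(items, size):
    """Split a list into consecutive chunks of at most `size` elements."""
    return [items[i:i + size] for i in range(0, len(items), size)]

def upload_in_batches(files, upload, batch_size=5, pause_seconds=900):
    """Upload files in batches, pausing between batches to stay under
    the capacity limit. `upload` performs the actual PUT request."""
    batches = chunk(files, batch_size)
    for i, batch in enumerate(batches):
        for f in batch:
            upload(f)
        if i < len(batches) - 1:
            time.sleep(pause_seconds)  # e.g. 900 s = 15-minute gap
```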