A full-featured HTTP API for addok
- Node.js 18 LTS and above
- redis 7
- wget or curl
- unzip
- python 3.10
- valid addok installation (conforming to addok.conf)
```shell
cp .env.sample .env
```

You can add `ADDOK_DATA_URL` to your `.env` file to automatically download and extract data when running `init-data.sh`.
| Environment variable name | Description |
|---|---|
| `ADDOK_CONFIG_MODULE` * | Path to the addok configuration file |
| `SQLITE_DB_PATH` | Path to the addok SQLite database |
| `ADDOK_FILTERS` | A list of fields to be indexed as available filters |
| `ADDOK_CLUSTER_NUM_NODES` | Number of nodes to instantiate (defaults to the number of CPUs) |
| `ADDOK_REDIS_URL` | Connection string to the addok Redis instance (can be an array) |
| `ADDOK_REDIS_DATA_DIR` | Path to the Redis data directory (in case you want addok-server to handle its own redis-server instance) |
| `ADDOK_REDIS_STARTUP_TIMEOUT` | Time limit allowed for Redis to start when using a managed Redis |
| `PYTHON_PATH` | Path to the Python executable to use |
| `ADDOK_DATA_URL` | URL or local path to an addok data archive (zip). If set, data will be automatically downloaded and extracted to the `data/` directory |
*Required
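For reference, a minimal `.env` could look like this. This is only a sketch: the paths and the `ADDOK_FILTERS` value format are illustrative assumptions — adjust them to your installation.

```shell
# Required: path to the addok configuration file
ADDOK_CONFIG_MODULE=data/addok.conf
# Path to the addok SQLite database
SQLITE_DB_PATH=data/addok.db
# Illustrative filter list; the exact format may differ in your setup
ADDOK_FILTERS=citycode,postcode,type
# Let addok-server manage its own redis-server instance
ADDOK_REDIS_DATA_DIR=data/
```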
If you want to use the currently downloaded data:

```shell
ADDOK_CONFIG_MODULE=data/addok.conf
SQLITE_DB_PATH=data/addok.db
ADDOK_REDIS_DATA_DIR=data/
```
```shell
npm install && npm run start
```

```shell
redis-server
```

Assuming you have already followed the previous step (copy and edit the env file):
Set the ADDOK_DATA_URL environment variable to automatically download and extract the data:
```shell
docker run -p 5000:5000 \
  -e ADDOK_DATA_URL=https://adresse.data.gouv.fr/data/ban/adresses/latest/addok/addok-france-bundle.zip \
  addok-server
```

Or using a local file mounted as a volume:
```shell
docker run -p 5000:5000 \
  -v /path/to/addok-france-bundle.zip:/app/archive.zip \
  -e ADDOK_DATA_URL=/app/archive.zip \
  addok-server
```

If `ADDOK_DATA_URL` is not set, you can mount a pre-extracted data directory:
```shell
docker run -p 5000:5000 \
  -v /path/to/extracted/data:/data \
  addok-server
```

You can use the initialization script to automatically download and extract data.
With an environment variable:

```shell
export ADDOK_DATA_URL=https://adresse.data.gouv.fr/data/ban/adresses/latest/addok/addok-france-bundle.zip
./init-data.sh
```

With a `.env` file:
Add to your .env file:
```shell
ADDOK_DATA_URL=https://adresse.data.gouv.fr/data/ban/adresses/latest/addok/addok-france-bundle.zip
```
Then run:
```shell
./init-data.sh
```

With a local file:
```shell
export ADDOK_DATA_URL=/path/to/addok-france-bundle.zip
./init-data.sh
```

- Download

```shell
wget https://adresse.data.gouv.fr/data/ban/adresses/latest/addok/addok-france-bundle.zip -O data.zip
```

- Extract

```shell
unzip data.zip -d ./data
```

- Remove the zip archive

```shell
rm data.zip
```

```shell
npm install && npm run start
```

Go to the data path and run redis-server:

```shell
cd data && redis-server
```

In the examples, we assume that the server was started on the default port: 5000.
| Param | Description | Default |
|---|---|---|
| `q` | Text input to geocode (required) | |
| `autocomplete` | Auto-complete mode (boolean) | `false` |
| `lon`, `lat` | Coordinates of a reference position | |
| `limit` | Number of returned results | `5` |
| `filters` | Additional filters (depending on the addok config) | `{}` |
Filters support multiple values using two syntaxes:
Using the `+` separator:

`?citycode=59000+59100+59200`

Using repeated parameters:

`?citycode=59000&citycode=59100&citycode=59200`

Both syntaxes can be combined, and values are deduplicated.
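The parsing described above can be sketched as follows. This is an illustrative re-implementation, not the server's actual code, and `parse_filter_values` is a hypothetical helper name:

```python
from urllib.parse import parse_qsl

def parse_filter_values(query_string, name):
    """Collect the values of filter `name` from a query string,
    splitting '+'-separated lists, merging repeated parameters,
    and deduplicating while preserving order."""
    seen = []
    for key, value in parse_qsl(query_string):
        if key != name:
            continue
        # '+' in a query string decodes to a space
        for item in value.split(" "):
            if item and item not in seen:
                seen.append(item)
    return seen

print(parse_filter_values("citycode=59000+59100&citycode=59100&citycode=59200", "citycode"))
# → ['59000', '59100', '59200']
```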
example:

```shell
curl "http://localhost:5000/search/?q=lil&autocomplete=1&limit=15"
```
example with multiple filter values:

```shell
curl "http://localhost:5000/search/?q=lille&citycode=59000+59100&type=municipality"
```
| Param | Description | Default |
|---|---|---|
| `lon`, `lat` | Coordinates of the reference position (required) | |
| `filters` | Additional filters (depending on the addok config) | `{}` |
Filters support multiple values (see /search documentation above).
example:

```shell
curl "http://localhost:5000/reverse/?lon=2.2&lat=48.12&type=locality"
```
example with multiple filter values:

```shell
curl "http://localhost:5000/reverse/?lon=2.2&lat=48.12&type=municipality+locality"
```
This endpoint allows you to process multiple requests in a single POST request. You must send an array of requests that will be processed in parallel.
| Param | Description |
|---|---|
| `requests` | An array of requests to process. Each request must be an object containing the keys `id`, `operation` and `params` (required) |
| `params` | Optional global parameters applied to all requests |
Request object
| Key | Description | Value example |
|---|---|---|
| `id` | Identifies each operation | `"foo"` (string) |
| `operation` | Defines which operation to execute | `"geocode"` or `"reverse"` (string) |
| `params` | Object with the same params used for `geocode` or `reverse` | `{"q": "lille"}` (geocode) - `{"lon": 2.2, "lat": 48.12}` (reverse) |
Filters can be specified:

- As strings with a `+` separator: `"citycode": "59000+59100"`
- As arrays: `"citycode": ["59000", "59100"]`
- In global `params` (applied to all requests)
- In individual request `params` (merged with the global params)
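The merging behavior described above can be sketched like this. It is an illustrative re-implementation, not the server's actual code; `normalize_filter` and `merge_filters` are hypothetical helper names, and the assumption that a per-request filter overrides a global filter of the same name is ours:

```python
def normalize_filter(value):
    """Turn a '+'-separated string or a list into a deduplicated list."""
    items = value.split("+") if isinstance(value, str) else list(value)
    seen = []
    for item in items:
        if item not in seen:
            seen.append(item)
    return seen

def merge_filters(global_filters, request_filters):
    """Apply global filters first, then per-request filters on top
    (assumption: same-name request filters override global ones)."""
    merged = {k: normalize_filter(v) for k, v in global_filters.items()}
    for k, v in request_filters.items():
        merged[k] = normalize_filter(v)
    return merged

merged = merge_filters(
    {"type": "municipality+locality"},           # global params
    {"citycode": ["59000", "59100", "59000"]},   # per-request params
)
# merged == {"type": ["municipality", "locality"], "citycode": ["59000", "59100"]}
```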
batch body request example:

```json
{
  "requests": [
    {"id": "foo", "operation": "geocode", "params": {"q": "lille"}},
    {"id": "bar", "operation": "reverse", "params": {"lon": 2.2, "lat": 48.12}}
  ]
}
```

batch with filters example:

```json
{
  "params": {
    "filters": {"type": "municipality+locality"}
  },
  "requests": [
    {
      "id": "foo",
      "operation": "geocode",
      "params": {
        "q": "lille",
        "filters": {"citycode": ["59000", "59100"]}
      }
    }
  ]
}
```

The CSV file must be passed via the `data` parameter, like this: `data=@path/to/file.csv`
example:

```shell
curl -X POST -F data=@path/to/file.csv http://localhost:5000/search/csv/
```
You can define the columns to be used via multiple `columns` parameters.
example:

```shell
curl -X POST -F data=@path/to/file.csv -F columns=street -F columns=city http://localhost:5000/search/csv/
```
Filter parameters specify column names from the CSV file to use as filter values (not direct filter values).
Available filters are defined by the ADDOK_FILTERS environment variable.
example with filter columns:

```shell
curl -X POST \
  -F data=@path/to/file.csv \
  -F columns=street \
  -F columns=city \
  -F citycode=code_insee_column \
  -F postcode=code_postal_column \
  http://localhost:5000/search/csv/
```

In this example:

- `citycode=code_insee_column` means "use the `code_insee_column` from the CSV as the citycode filter"
- `postcode=code_postal_column` means "use the `code_postal_column` from the CSV as the postcode filter"
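For illustration, an input CSV matching the command above might look like this. The column names follow the example; the row values are invented:

```csv
street,city,code_insee_column,code_postal_column
1 rue Faidherbe,Lille,59350,59000
```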
The CSV file, encoded in UTF-8, must be passed via the `data` parameter. It must contain `lon` and `lat` columns.
example:

```shell
curl -X POST -F data=@path/to/file.csv http://localhost:5000/reverse/csv/
```
Filters work the same way as for `/search/csv` (see above).
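For illustration, a minimal reverse input file might look like this (the coordinates are invented; any extra columns are carried through):

```csv
name,lon,lat
point-1,2.2,48.12
```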