Starting the project postcode FRS
I started tinkering with Rust and some open data a few days ago. I am a bit ashamed of how dumb the project currently is, and it has no README yet (original commit). I decided to publish it on GitHub anyway, as soon as possible, because I have nothing to lose.
The project can build a SQLite database from a JSON file, and it now serves the dataset through an HTTP endpoint. Each feature is handled by its own CLI binary.
Current state
I have already shared a draft about transforming a JSON file into a SQLite database. I added a new binary to the project: a simple HTTP API server with a single GET /api/v1/communes endpoint. Yes, it returns the entire 1.6 MB dataset at the moment. It’s the minimum viable product; I will add filters next.
I use Axum for the HTTP API because I watched a great introduction by Jon Gjengset: Decrusting the axum crate. There are many HTTP servers and API frameworks available; at the end of the day, I should just pick one and gain some experience with it.
Goals
The project should make data access easier. In this example, it focuses on the French city names and postcodes published by the French digital services. While the first draft works for postcodes and names of communes in France, it can be extended to other data.
It’s a step into bigger projects.
Why ?
The best reason is to learn Rust. The second best reason is to build something a bit useful that covers several subjects. Damn, that pleases me.
The project scope is small but touches many topics: database, SQL queries, API, CLI, binary management, serialization, deserialization, file handling, and more I’m missing. The scope can grow as needed, but the project has already been in a working state for a few days.
Why SQLite?
A SQLite database is easy to manipulate: it’s a single file. A file has major advantages. It is easy to share. It does not need a server (compared to PostgreSQL). It can be generated on the fly. It only relies on the filesystem. Hey, the filesystem is available (nearly) everywhere!
The SQLite database can be edited on the fly with many editors, whether desktop software or in a web browser. It’s perfect for prototyping and trying things out.
So yes, SQLite works in nearly every environment with few requirements and dependencies: I’m looking at you, LibSQL.
Next steps and extensions
Being able to filter the data from the GET endpoint is a must.
A small UI could help visualize the data, and the API should answer queries such as:
- What are the communes matching a postcode?
- What are the postcode(s) of a commune?
- Some statistics: how many communes are there per postcode on average, what are the min and max for each postcode, etc.?
- Are there differences between the official postal name and the commune name? Are they errors, or is the naming well thought out with deliberate exceptions?
- Autocomplete a commune name
- What are all the communes of department 57? In this case, the French postcodes match the pattern 57xxx
- … more with imagination
Maybe a dashboard UI can answer these questions, or perhaps another type of user interface, such as a static website built from queries to the database.
The Rust “database builder” binary could fetch the JSON directly from a source instead of relying on a local file.
I see a bright future and this project is only a small (extendable) step. A collection of SQLite databases can be flexible, simple and efficient to share and consume data. Each database provides the required data for a specific project with limited scope (KISS).
SQLite databases can be used in different ways:
- exposed as an HTTP API ready to be consumed
- consumed directly by another project through the programming language’s SQLite library (Python, for instance, ships sqlite3 in its standard library)
- loaded in memory (the special `:memory:` filename) for efficient data queries when the dataset fits in RAM
- merged together to create more relationships with the minimum amount of data for each use case
- and others I haven’t thought of yet