A python script, running in a docker container, that reads energy consumption and back-delivery data from the digital “smart?” meter (as installed in many homes in the Netherlands), and stores it in mysql for further use.
Somewhere in 2018, my old analog / mechanical energy meter broke down, and was replaced by a new digital one. The counters of these new meters are read remotely over an internal GSM connection. But… you also get a local serial port, called the P1 port, which allows the home owner to read the energy consumption data as well.
So back in 2018, I searched the internet for info on how to read this data, and found different examples and approaches. I even built my own ESP8266 WiFi enabled micro controller version to read the meter. But as I already had a small linux server near the meter, I went for a simple serial to USB cable, and connected it to the linux server. A few python examples later, I wrote my own version of the script, installed it natively on the linux box, and started measuring the data. And that was sort of it… I did not add a nice graph engine, that was for a later date 😉
Fast forward two years, to 2020. Still no graph engine (you know how that goes with hobby projects), but two years of measurements stored already. However, by this time my small linux server started to freeze / lock up every so often. My guess is temperature issues, combined with old age.
As I needed some NAS disk storage, I got myself a NAS which is capable of running docker containers: a Synology DS918+. Quite nice! Being able to run docker images allows me to move my stuff from the small linux server onto the NAS.
One of the things to move was this energy meter measurement script.
Plan of attack
The migration plan (all using ssh command line, no NAS gui used):
- install mysql in a container.
- install phpmyadmin in a container (sql web gui).
- install python, a cron-daemon, and the script in a container.
- connect the serial to usb converter cable.
- glue everything together, and get it to work.
Of course, a few issues popped up along the way:
- mysql turned out to use an authentication plugin which is not supported by the python client library.
- my python script needed a library which is no longer in the ubuntu repository.
- my NAS needed some convincing to use the serial to usb converter.
Issue one was fixed by changing the password / authentication plugin for the database user which the python script uses.
Issue two was fixed by using “pip install” instead of “apt-get install”. But… later on in the plan, I will have to upgrade the script from python 2 to python 3! Some deprecation warnings already show up during the build.
Issue three was solved by loading some additional kernel modules in the NAS.
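For reference, the fix for issue one boils down to a single SQL statement, sketched here as a python string. The user name and password are placeholders, not the actual ones from the project; adjust them to your own setup.

```python
# Hypothetical example of the fix for issue one: switch the database user
# that the python script uses to the older mysql_native_password plugin,
# which the (older) python mysql client library does understand.
# 'p1user' and 'secret' are placeholders.
ALTER_STMT = (
    "ALTER USER 'p1user'@'%' "
    "IDENTIFIED WITH mysql_native_password BY 'secret';"
)

# Run this once against the mysql container, for example via the
# phpmyadmin web gui (SQL tab), or any other mysql client.
print(ALTER_STMT)
```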
Just have a look at the README.txt file in the git project… It’s quite extensive, and you need a copy of the code and files anyway.
In summary; edit the files to match your setup. Put the IP address of your docker server in some of the files, and update the database password if you like (not really needed if it’s not accessible from the internet). Make sure the serial port is on /dev/ttyUSB0, or edit some files to point at the correct device. Build and start all containers.
Encore / Bonus
So… I did not have any graphing module at this point. It’s always some work to make a custom script / page for that, which is why it wasn’t done yet. But why make your own? In recent years, at the office, we started using Grafana. It’s a great tool for making dashboards with graphs!
And it did not take more than a couple of minutes to get up and running, including the graph. Albeit in a crude first version. Probably fine for the next two years 😉
The steps were easy, see the README.txt again, at the bottom. In summary; create a docker volume to make sure the data is kept between restarts. Run a Grafana docker image connected to that volume. In Grafana, set up a datasource, connecting to the mysql database. And finally create the graph on a new dashboard. An example sql query is in the README.txt.
Some notes on the python script
It’s been quite a while since I wrote the script, but there are many code comments in it. So just give it a good read, in case it doesn’t fit your situation. It should be easy to change / fix / update.
Here some notes:
- At the top, the script opens the serial port, and connects to the database.
- There is a big map “obis_codemap”, which translates the codes at the start of the serial output lines into human readable attribute labels. It follows standard version 5.0.2 for DSMR 5.0, and is taken from a manual, so it should be fine.
- Then there is a helper function “parse_value”. It transforms some of the values into usable versions for sql.
- The code will look for a proper start line, and when found will switch to reading the lines one by one, using obis_codemap to map them to usable data. Each data line is split on the round brackets into separate fields.
- The meter has an internal expansion bus (MBUS). In my meter, there is a (wireless) link with the GAS meter. In other cases you could have some other extra connections. You may need to tweak lines 139-146 for your situation.
- Lines 153-160 handle the different combinations of field counts and data sizes, also depending on partial field names.
- 164-165 are used for the MBUS extensions, to memorize the type number. The serial data (called a datagram) is read top to bottom, and the MBUS#_DEVICE_TYPE value will determine the meaning of the data lines that follow.
- 168-171, printing the data line. Format: original input, followed by --> and the parsed data.
- 175-180, look for end of message, and keep track of total message to use later for CRC checksum calculation.
- 189-199, checksum handling. On error exit without storing measurement.
- 202-207, insert data in database, and close.
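To illustrate the parsing steps above, here is a minimal stand-alone sketch. This is not the actual script: the tiny excerpt of the code map and its label names are illustrative, and the helpers are simplified versions of what the notes describe (the CRC16 variant with polynomial 0xA001, as specified for DSMR datagrams).

```python
import re

# Illustrative excerpt of the obis_codemap described above; the real map is
# much larger, and the label names here are made up for this sketch.
obis_codemap = {
    "1-0:1.8.1": "ELECTRICITY_USED_TARIFF_1",       # kWh consumed, tariff 1
    "1-0:2.8.1": "ELECTRICITY_DELIVERED_TARIFF_1",  # kWh delivered back
}

def parse_value(raw):
    """Strip the unit suffix (e.g. '*kWh') and return a value usable in SQL."""
    value = raw.split("*")[0]
    try:
        return float(value)
    except ValueError:
        return value  # timestamps and such are kept as strings

def parse_line(line):
    """Split a datagram line on its round brackets into code + field values."""
    code = line.split("(")[0]
    fields = re.findall(r"\(([^)]*)\)", line)
    label = obis_codemap.get(code)
    return label, [parse_value(f) for f in fields]

def crc16(data):
    """CRC16 with polynomial 0xA001 (CRC16/ARC), used for the DSMR checksum.

    The checksum covers everything from the '/' start line up to and
    including the '!', and is compared to the 4 hex digits after the '!'.
    """
    crc = 0
    for byte in data:
        crc ^= byte
        for _ in range(8):
            crc = (crc >> 1) ^ 0xA001 if crc & 1 else crc >> 1
    return crc

print(parse_line("1-0:1.8.1(004581.968*kWh)"))
# → ('ELECTRICITY_USED_TARIFF_1', [4581.968])
```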
Back in 2018, it was a nice project to find out how to read the P1 meter port. You can find many ways to do it on the internet. Lots of data available.
Migrating from native python into docker was not a big deal. Of course I took some shortcuts. You can make the container smaller, use fewer build layers, and use internal docker networks instead of the host IP and external port mappings. But this was just a hobby thing, not an enterprise solution.
Nice bonus was the easy / quick addition of the graph using Grafana.
To get better graphing performance, it would be good to look at using InfluxDB instead of (or next to) mysql. InfluxDB is built for storing time-series data, and for querying it fast. Perhaps some other time…
Thijs Kaper, February 3, 2020.