Noob python temp sensor, Feedback request

First of all, I'm not a pro, but I would like some feedback on the code I'm running.

Situation: 5 DS18B20 sensors connected to a Pi. I want to read out the temperature and save it in some kind of log file.

Steps:
I symlinked the w1_slave files to readable names in my working directory: sensor1, sensor2, sensor3, etc.

Python script to read them and put the values in Excel:

import datetime
import time
import pandas as pd

i = 1
while i < 9:

   a = open ("sensor1", "r")
   b = open ("sensor2", "r")
   c = open ("sensor3", "r")
   d = open ("sensor4", "r")
   e = open ("sensor5", "r")
   howLate = datetime.datetime.now()
   day = datetime.date.today().year
   
   sensor1 = int(a.read()[69:74]) / 1000
   sensor2 = int(b.read()[69:74]) / 1000
   sensor3 = int(c.read()[69:74]) / 1000
   sensor4 = int(d.read()[69:74]) / 1000
   sensor5 = int(e.read()[69:74]) / 1000

   
   existing_file = 'temp.xlsx'
   new_data = {'time': [howLate], 'sensor1': [sensor1], 'sensor2': [sensor2], 'sensor3': [sensor3], 'sensor4': [sensor4], 'sensor5': [sensor5]}
   df_new = pd.DataFrame(data=new_data)
   df_existing = pd.read_excel(existing_file)
   df_combined = df_existing.append(df_new, ignore_index=True)
   df_combined.to_excel(existing_file, index=False)
    
   a.close()
   b.close()
   c.close()
   d.close()
   e.close()
   time.sleep(2)
   i += 1

I made a crontab entry to run this script every minute, and another one to refresh the xlsx file every hour and copy the old file to an xlsx file with a time stamp.

Questions I have:

What would be best practice to solve this?
Is it bad that I used Excel? Why would I use JSON, for example?

Just a few thoughts off the top of my head…

You should think about using JSON; it is much more portable than xlsx, and just about anything can parse it.

Saving data in JSON opens up the possibility of having an Excel file that connects to it and treats it as a data source. Then you can have a massively complicated Excel setup (if you want to), but still log in a simple text format. Decouple it, you will thank me later :slightly_smiling_face: .

Use error handling in your script; if something goes wrong you don't want it to just crash and possibly corrupt your file.
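
A rough, untested sketch of what that could look like in Python, reusing your [69:74] slice (the sensor path is just a placeholder):

import datetime

def read_temp(path):
    # returns the temperature in degrees C, or None if the sensor file
    # is missing or does not contain a valid reading
    try:
        with open(path, "r") as f:
            raw = f.read()
        return int(raw[69:74]) / 1000
    except (OSError, ValueError) as error:
        print(f"{datetime.datetime.now()} could not read {path}: {error}")
        return None

reading = read_temp("sensor1")
if reading is not None:
    print(f"sensor1: {reading} °C")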

You have a fixed number of iterations; using a for loop is the "Python way" to handle definite iteration.

Don't write to a file in each iteration; collect all the data in a variable, then flush it to a file at the end of the loop.

3 Likes

Originally I wanted JSON, but I have no experience with it, so formatting the text for the JSON conversion went wrong; after the second entry I couldn't read it in JavaScript anymore. What would be a good resource to read about JSON?

Well, I know the answer to that: W3Schools :stuck_out_tongue:
Going to look into it.

I was thinking about removing the loop completely and just having a data point every minute; that would be enough. At this point I just want to see what happens.

Error handling.
A possible error would be a sensor disconnect, but then the program just chugs on, printing time stamps and 0-degree output, so that would be clear from the log file.

Well, that sounds like I'm missing the point of error catching in this case, but I think that's a case of me not understanding something. Where can I read about catching errors in Python? And what errors would you catch?

Did you try to format JSON manually? You don't need to; both Python and JavaScript have built-in ways of dealing with it.
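
In Python that's the built-in json module; a tiny example:

import json

entry = {"time": "2024-09-19 14:56:53", "sensor1": 26.437}

text = json.dumps(entry)   # dict -> JSON string
back = json.loads(text)    # JSON string -> dict

print(text)
print(back["sensor1"])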

Errors come in many forms: human errors, misplacing a file, denying access to it, having it locked by some other process, sensor disconnection, power-loss recovery that went bad… The point of error handling is for you to know exactly what happened and work around it.

1 Like

Looks like a fun project. The format kind of depends on how you want to consume the data, e.g. do you want to plot it out in python or make a graph in excel?

One of the most fungible formats for data in my experience is good old CSV (comma separated variables).

You could probably write a few lines of bash script to do it something like so:

$ while true;do sleep 1;echo -n $(date),;echo -n sensor{1..5},;echo '';done

But you’d want to echo -n $(cat sensor{1..5}), to get the values which could be parsed later. Or keep building on this to parse it up front etc.

I love me some pandas, but it is a pretty heavy tool to hit this nail.

Happy hacking and good luck with your project!

2 Likes

Latest Excel is pretty good at parsing whatever you throw at it. You can use PowerQuery to transform any data you want. Even on macOS, which was a pain in the butt to do.

Having said that, yeah, CSV is another good option; it will be very easy to just append a line and be done with it.

Just a small correction: CSV stands for Comma Separated Values. No variables there, which might be a bit confusing in the context of programming for noobs.

3 Likes

@ubergarm @vivante

Thanks for all the tips. I'm going to look at improving my code, and then I will show it. The tutorials I find on these sensors are nice, but I don't feel like I'm learning from them; like this I feel like I'm learning.

Reminds me of that time I had to demolish a Louis Vuitton office without power tools.

While writing this response I was already trying bits out. Pandas might be a heavy hammer, but cat is what is taking the most time: if I cat a single sensor it takes about 2 seconds, but doing the sensor{1..5} option takes about 13 seconds.

Commands I use:

tail -n 1 sensor1|rev| cut -b -5|rev
echo -n $(tail -n 1 sensor{1..5}|rev|cut -b -5|rev)

But it's all slow, Python too. C on the Uno R3 was much faster, and C will be faster on the Pi too, but I'm not sure I'm at the point where I can go there. Well, those are just some thoughts.

1 Like

What should be between the JSON entries? Right now I get a "no whitespace" error; there should be something between them.

{
 "time": [
  "2024-09-19 14:56:53.262428"
 ],
 "sensor1": [
  26.437
 ],
 "sensor2": [
  26.812
 ],
 "sensor3": [
  27.25
 ],
 "sensor4": [
  26.562
 ],
 "sensor5": [
  21.75
 ]
}{
 "time": [
  "2024-09-19 14:57:02.192649"
 ],
 "sensor1": [
  26.437
 ],
 "sensor2": [
  26.812
 ],
 "sensor3": [
  27.25
 ],
 "sensor4": [
  26.625
 ],
 "sensor5": [
  21.75
 ]
}

I assume the rest of the output is good, though I don't have the knowledge to know whether the output is good. Now I think I'm wrong about that too.

Oh, strange that cat is slow… it is C under the hood and would be roughly equivalent to Python or another language's read() function. But you tried it, and the proof is in the eating of the pudding, as they say. Maybe something with the driver or the Pi, dunno.

Regarding your JSON, typically you would create an array (a list object in Python), i.e. put a comma between each "entry" and put them all between brackets [], e.g.:

[
  {
    "key": "value"
  },
  {
    "key": "value2"
  }
]

This can get a bit confusing as a trailing , on the last entry is technically not valid JSON… Also you have to remember to close the last ] when your program exits. While possible, this is not ideal imo.

You could go with the JSONL format, which is just one JSON blob per line. It is common to use the newline \n character for parsing text streams.
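
A minimal sketch of appending JSONL from Python (the file name is just an example):

import json

entry = {"time": "2024-09-19 14:56:53", "sensor1": 26.437, "sensor2": 26.812}

# append one JSON object per line (JSONL)
with open("sensorlog.jsonl", "a") as f:
    f.write(json.dumps(entry) + "\n")

# reading it back is one json.loads() per line
with open("sensorlog.jsonl") as f:
    entries = [json.loads(line) for line in f]
print(entries)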

I copy-pasted your data into a file called mongoosh.json and used jq to convert it like so, so you can see what it looks like in JSONL.

$ cat mongoosh.json | jq -c '.'

{"time":["2024-09-19 14:56:53.262428"],"sensor1":[26.437],"sensor2":[26.812],"sensor3":[27.25],"sensor4":[26.562],"sensor5":[21.75]}
{"time":["2024-09-19 14:57:02.192649"],"sensor1":[26.437],"sensor2":[26.812],"sensor3":[27.25],"sensor4":[26.625],"sensor5":[21.75]}

Though it is a bit odd to make your values arrays… If you have time, read up a bit on Python list and dict data structures. Getting comfortable playing with those and using them together will help your JSONfoo.

For example you have:

"sensor1": [ 26.437 ]

Given there will only ever be one value per time stamp and sensor reading, it'll be easier to parse/unpack later if it is just a single key:value pair, e.g.:

"sensor1": 26.437

So your data would then look like this, with one entry per line:

{"time":"2024-09-19 14:56:53.262428","sensor1":26.437,"sensor2":26.812,"sensor3":27.25,"sensor4":26.562,"sensor5":21.75}
{"time":"2024-09-19 14:57:02.192649","sensor1":26.437,"sensor2":26.812,"sensor3":27.25,"sensor4":26.625,"sensor5":21.75}

However, I'd still suggest considering CSV (comma separated values [thanks @vivante for correcting me!]) as it is convenient to just append one line for each entry. No need to close any brackets or worry about trailing commas if your program stops. It is also much less "wordy": the keys live once in the header line, which keeps it DRY (don't repeat yourself) and lets you control the structure of the CSV data.
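
If you do go CSV, Python's built-in csv module handles the quoting for you; a rough sketch (file name made up) that writes the header only once and then appends one row per reading:

import csv
import os

row = {"time": "2024-09-19 14:56:53", "sensor1": 26.437, "sensor2": 26.812}
fname = "sensorlog.csv"

new_file = not os.path.exists(fname)
with open(fname, "a", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=row.keys())
    if new_file:
        writer.writeheader()   # header only the first time
    writer.writerow(row)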

So make it easy on yourself by writing the simplest, most fungible format. Once you have CSV, it can become anything else using tools like mlr, e.g.:

$ cat mongoosh.csv

time,sensor1,sensor2,sensor3,sensor4,sensor5
"2024-09-19 14:56:53.262428",26.437,26.812,27.25,26.562,21.75
"2024-09-19 14:57:02.192649",26.437,26.812,27.25,26.625,21.75

$ mlr --icsv --ojson cat mongoosh.csv
[
{
  "time": "2024-09-19 14:56:53.262428",
  "sensor1": 26.437,
  "sensor2": 26.812,
  "sensor3": 27.25,
  "sensor4": 26.562,
  "sensor5": 21.75
},
{
  "time": "2024-09-19 14:57:02.192649",
  "sensor1": 26.437,
  "sensor2": 26.812,
  "sensor3": 27.25,
  "sensor4": 26.625,
  "sensor5": 21.75
}
]

Anyway, just my own opinions as a data wrangler and plumber… Even Excel and Pandas gobble up simple CSV files like this no problem.

Happy hacking and enjoy all the command line power tools at your disposal to demolish this Louis Vuitton office! lol xD

1 Like

I first want JSON to work, then I'll go on to CSV. The slowness might just be that it's an old Pi; I think it's one of the Pis that spent a couple of years running 24/7 doing some streaming tasks on nginx.

Thanks for your clear examples, this helps me a lot.

1 Like

@ubergarm @vivante thanks for the feedback.
Well, CSV is working. JSON not really; kinda, but kinda completely not: it is still missing the [ at the start of the file and the ] at the end. That last one is annoying because I'm appending to the end of the file. I also added exceptions. Now collecting data.

import datetime
import json

i = 1
for i in range(9):
   try:
       a = open ("sensor1", "r")
       b = open ("sensor2", "r")
       c = open ("sensor3", "r")
       d = open ("sensor4", "r")
       e = open ("sensor5", "r")
   except Exception as error:
       print ("Sensor Symlinks not found", error)
   howLate = datetime.datetime.now()
   day = datetime.date.today().year
   fileName = f"sensorlog-{day}.csv"
   jsonFile = f"sensorlog-{day}.json"
   try:
       sensor1 = int(a.read()[69:74]) / 1000
       sensor2 = int(b.read()[69:74]) / 1000
       sensor3 = int(c.read()[69:74]) / 1000
       sensor4 = int(d.read()[69:74]) / 1000
       sensor5 = int(e.read()[69:74]) / 1000
   except Exception as error:
       print ("Sensor data is empty", error)
   lineCsv = f"\"{howLate}\",{sensor1},{sensor2},{sensor3},{sensor4},{sensor5}\r\n"
   new_data = {'time': howLate, 'sensor1': sensor1, 'sensor2': sensor2, 'sensor3': sensor3, 'sensor4': sensor4, 'sensor5': sensor5,}
   print (lineCsv)
   try:
       #write csv file
       append = open(fileName , "a")
       append.write (lineCsv)
       #write json file this is going wrong with separation
       nosj = open(jsonFile, "a")
       drol =  json.dumps(new_data, default=str, ensure_ascii=True, separators=(',', ':'))
       nosj.write (", \n")
       nosj.write (drol)
   except Exception as error:
       print ("write error", error)
   i += 1

I did some googling about the sensors being slow; apparently it's the sensors taking 750 ms to get the data, otherwise it's not accurate.

1 Like

Just a few more things…

You don't need to keep track of the index manually in for loops; the loop does it for you. This way you reduce the chance of an infinite loop by 1,079657%.

# will iterate from 0 to 8 inclusive
for i in range(9):
    print (i)

# or if you want to be specific about your indices, 1 to 9 inclusive
for i in range(1,10):
    print (i)

You don't need to format JSON data yourself. For your use case CSV is perfectly fine, but if you get into more complex objects JSON could be more suitable and easier to parse. And while we are at it, you shouldn't write data to files in every loop iteration. In your case there is plenty of time to do it, but keep in mind that RAM is orders of magnitude faster than any SSD.

import datetime
import json

data_list = []

for i in range(1, 10):
    # Generate some fake data
    sensor_data = {
        'time': datetime.datetime.now().strftime('%Y-%m-%d %H:%M:%S'),
        'sensor1': i * 0.1,
        'sensor2': i * 0.2,
        'sensor3': i * 0.3,
        'sensor4': i * 0.4,
        'sensor5': i * 0.5,
    }
    data_list.append(sensor_data)

with open('sensor_data.json', 'w') as json_file:
    json.dump(data_list, json_file, indent=4)

This is just the simplest possible example and it will overwrite all your data in the file each time; you can test if the file exists, then read and parse it, append the new data and save. Catch any errors of course; for example, if the file is corrupted, start a new one and leave the old one there for analysis of why it was corrupted.
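
Roughly like this, as an untested sketch building on the example above:

import json
import os

def append_entry(path, entry):
    # load the existing list if the file is there and valid,
    # otherwise start fresh and keep the broken file for analysis
    data = []
    if os.path.exists(path):
        try:
            with open(path) as f:
                data = json.load(f)
        except json.JSONDecodeError:
            os.rename(path, path + ".corrupt")
    data.append(entry)
    with open(path, "w") as f:
        json.dump(data, f, indent=4)

append_entry("sensor_data.json", {"time": "2024-09-19 14:56:53", "sensor1": 26.437})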

Log errors, don’t just print them out. You are not watching execution 24/7 and catching errors in real time. And errors will pop up where you don’t expect them.
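
The built-in logging module gets you most of the way there; a minimal sketch (log file name made up):

import logging

logging.basicConfig(
    filename="sensors.log",
    level=logging.INFO,
    format="%(asctime)s %(levelname)s %(message)s",
)

try:
    with open("sensor1") as f:
        value = int(f.read()[69:74]) / 1000
except Exception:
    # writes the message plus the full traceback, with a timestamp
    logging.exception("failed to read sensor1")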

Start looking into functions.
Let's say this goes well and you think to yourself "I need this in 20 more places". Think about how you would scale up: why not 20 much cheaper Pi Pico W boards gathering data, with your main Pi contacting them via a simple API to get the latest data?
You don’t want to repeat the code for each remote device, you want functions.

...
def read_remote_sensors(target):
    # in the function body, connect to the specified target and read data, then use the return value
    return value
bathroom_data = read_remote_sensors(bathroom)
kitchen_data = read_remote_sensors(kitchen)
# now you can send data to a validation and saving function(s)
...

And last but not least, practice. I know AI can bang out code in no time, and sure, take advantage of it, but practicing on your own will open your eyes to what even simple code can do. Ideas will come, you will start seeing solutions where others don't even see a problem, and it will make you better at what you do.

3 Likes

I'm now collecting some data for a couple of days and looking into doing some JavaScripty things to look at it. I ordered 10 new sensors; I hope they are better quality than the ones I have now. I cannot change the resolution, and it is at the max now.

I experimented with an Uno R3 and then sent the information RX-TX / TX-RX over UART. This went on for a bit, but the connection is crap: losses at 9600 baud. Also, the maximum number of sensors I can read out on the one pin is 64. I think in the future this Pi is also going to be on duty to control some fans in my rack, and some LED strips for feedback about temperature. I want to see the health of my rack; my PC is one thing, but the preamps too. This is the plan I have now. I have three spare sensors: I will put one at or in my PC, stick one to the SSL, and stick one to the Drawmer. The bottom doesn't need sensors because that would be storage.

But for now I have to make something of a front end so I can see what's happening. For now it's only a dirty PHP script that shows the current temp.
But that is just to see that it's working, and not definitive.

<?php

header("Refresh:1");
$device_file_sensor1 =  'sensor/sensor1';
$device_file_sensor2 =  'sensor/sensor2';
$device_file_sensor3 =  'sensor/sensor3';
$device_file_sensor4 =  'sensor/sensor4';
$device_file_sensor5 =  'sensor/sensor5';

$data1 = file($device_file_sensor1, FILE_IGNORE_NEW_LINES);
$data2 = file($device_file_sensor2, FILE_IGNORE_NEW_LINES);
$data3 = file($device_file_sensor3, FILE_IGNORE_NEW_LINES);
$data4 = file($device_file_sensor4, FILE_IGNORE_NEW_LINES);
$data5 = file($device_file_sensor5, FILE_IGNORE_NEW_LINES);

$temperature1 = null;
if (preg_match('/YES$/', $data1[0])) {
    if (preg_match('/t=(\d+)$/', $data1[1], $matches1, PREG_OFFSET_CAPTURE)) {
        $temperature1 = $matches1[1][0] / 1000;
	}
}
$temperature2 = null;
if (preg_match('/YES$/', $data2[0])) {
    if (preg_match('/t=(\d+)$/', $data2[1], $matches2, PREG_OFFSET_CAPTURE)) {
        $temperature2 = $matches2[1][0] / 1000;

	}
}
$temperature3 = null;
if (preg_match('/YES$/', $data3[0])) {
    if (preg_match('/t=(\d+)$/', $data3[1], $matches3, PREG_OFFSET_CAPTURE)) {
        $temperature3 = $matches3[1][0] / 1000;
	}
}
$temperature4 = null;
if (preg_match('/YES$/', $data4[0])) {
    if (preg_match('/t=(\d+)$/', $data4[1], $matches4, PREG_OFFSET_CAPTURE)) {
        $temperature4 = $matches4[1][0] / 1000;
	}
}
$temperature5 = null;
if (preg_match('/YES$/', $data5[0])) {
    if (preg_match('/t=(\d+)$/', $data5[1], $matches5, PREG_OFFSET_CAPTURE)) {
        $temperature5 = $matches5[1][0] / 1000;
	}
}
    echo " <html><head><title>Temp Sensor</title> \n <LINK href=style.css rel=stylesheet type=text/css> \n </head><body>";
    echo "<div class=temp ;> \n";
    echo "<div class=" . $matches1[1][0] . ">Sensor 1 <br><Br><center><h1> ";
    echo  $temperature1 . "°C";
    echo "</h1></center></div> \n";
    echo "<div class=" . $matches2[1][0] . ">Sensor 2 <br><Br><center><h1> ";
    echo  $temperature2 . "°C";
    echo "</h1></center></div> \n";
    echo "<div class=" . $matches3[1][0] . ">Sensor 3 <br><Br><center><h1> ";
    echo  $temperature3 . "°C";
    echo "</h1></center></div> \n";
    echo "<div class=" . $matches4[1][0] . " >Sensor 4 <br><Br><center><h1> ";
    echo  $temperature4 . "°C";
    echo "</h1></center></div> \n";
    echo "<div class=" . $matches5[1][0] . ">Sensor 5 <br><Br><center><h1> ";
    echo  $temperature5 . "°C";
    echo "</h1></center></div> \n";
    echo "</div> \n";
?>
2 Likes

If you are up for it, a build log would be amazing!

2 Likes

If you are looking at other devices, a NodeMCU with ESPEasy works pretty well. You can send the data out over MQTT and then show it with anything that can access that (like Home Assistant, Grafana, or your own script).

Of course, doing everything yourself is a great way to learn!

Like this:
[image]
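
If you go the MQTT route, publishing from Python is only a few lines with the paho-mqtt library; a rough sketch (broker address and topic are made up):

import json
import paho.mqtt.client as mqtt

client = mqtt.Client()                  # paho-mqtt 2.x also wants a callback_api_version here
client.connect("192.168.1.10", 1883)    # your MQTT broker
payload = json.dumps({"sensor1": 26.437, "sensor2": 26.812})
client.publish("rack/temperature", payload)
client.disconnect()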

3 Likes

I will do that, I think it's fun. Also, when I ordered the new DS18B20s and saw the original manual that goes with the sensor, I was hit by the fact that I really miss well-written manuals, because they have become some kind of entity that is almost extinct.

Isn't this poetry? A manual like this is just handy. How often do you still see manuals like this for hardware?
https://cdn.bodanius.com/media/1/4ee1359_ds18b20-datasheet.pdf

Those sensors surprise me. I never thought it would be that easy to get them working. And the fun coding they have given me so far. (I'm no pro coder, I do this as a hobby, and writing code has calmed me down since I was a kid.)

I love using functions, but right now I'm still prototyping; I just want to see the steps on paper, in a way, and too many functions and loops abstract the process too much for me to get a clear picture. I honestly hate that loop I have now to get more data points, but I want more data points, and in a way that makes the loop useless because the timing is all over the place.

But yeah, you are absolutely right: the next step is logs in a file, and functions within functions, because that also gives cleaner code.

Dammit, auto-save in Visual Studio; I had an old window open. Argh, that is why you just use nano.

1 Like

It's a bit the same for me: I have some Pis, and yeah, there are tailor-made solutions, but I like doing it like this.

It's quality time with my computer and hardware; I just enjoy doing this.

1 Like

Code update:

import datetime
import json
from multiprocessing.dummy import Pool as ThreadPool


def read_sensor(sensor_n):
   try:
       fName = "sensor"+str(sensor_n) 
       file = open (fName, "r")
       return  int(file.read()[69:79]) / 1000
   except Exception as error:
        msg = open("error.log", "a")
        msg_error = ("write error", error)
        write.msg(msg_error)
        msg.close()

def write_logs(sensor1,sensor2,sensor3,sensor4,sensor5):
   try:   
       howLate = datetime.datetime.now()
       strLate = (howLate.strftime("%Y-%m-%d %H:%M:%S"))
       lineCsv = f"\"{strLate}\",{sensor1},{sensor2},{sensor3},{sensor4},{sensor5}\r\n"
       new_data = {'time': strLate, 'sensor1': sensor1, 'sensor2': sensor2, 'sensor3': sensor3, 'sensor4': sensor4, 'sensor5': sensor5,}
       print (lineCsv)
       #write csv file
       append = open("sensorlog-2024.csv" , "a")
       append.write (lineCsv)
       append.close()
       #write json file this is going wrong with separation
       nosj = open("sensorlog-2024.json", "a")
       drol =  json.dumps(new_data, default=str, ensure_ascii=True, separators=(',', ':'))
       nosj.write (", \n")
       nosj.write (drol)
       nosj.close()
   except Exception as error:
       msg = open("error.log", "a")
       msg_error = ("write error", error)
       write.msg(msg_error)
       msg.close()
sensors = [1,2,3,4,5]
for i in range(11):
    pool = ThreadPool(8)
    results = pool.map(read_sensor, sensors)
    write_logs(results[0],results[1],results[2],results[3],results[4])

Added functions, error logging to a file, and multithreading for a bit more speed.