Raspberry Pi Sensor Data → Firestore → ChatGPT → Gmail

Google Firestore

Over the weekend, I took another step in building my Raspberry Pi sensor network. I wrote a Python script that moves the data I’m collecting from my sensors (temperature, humidity, and soil moisture) into a Google Firestore database. Up until now, the data from my Raspberry Pi has been written to a Google Sheet every twenty minutes using a cron job. That setup worked well for the first phase of the project because it let me learn the basics of reading sensor data and working with the Google Sheets API.

Now that I have a steady stream of data, I want to do more with it. The Raspberry Pi sits behind a firewall at work, which means I can’t always access it directly, but the Google Sheet data is available anywhere. My goal for the second phase was to use Firestore as a more flexible database that could support other integrations later on. Setting up a new Firestore project in Google Cloud was simple enough, but I had to slow down and really think through the “service account” permissions and JSON key creation. These are small details, but they’re part of a bigger picture I’m learning to plan for before starting to code. My last post introduced the idea of Model Context Protocol (MCP) systems. Writing this Firestore script reminded me how important it is to think like those systems do: anticipating how data flows from one service to another and preparing it for whatever comes next.
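The move itself can be sketched in a few lines. This is a minimal, hypothetical version of that script: the sheet name ("Sensor Log"), the column headers, the Firestore collection name ("sensor_readings"), and the key file path are all assumptions, not the actual names from my project.

```python
# Hypothetical sketch: pull rows from the Google Sheet and write them to Firestore.
# Sheet name, column headers, collection name, and key path are assumptions.
from datetime import datetime, timezone

def to_document(row):
    """Convert one spreadsheet row into a typed Firestore document."""
    return {
        "temperature_f": float(row["temperature_f"]),
        "humidity_pct": float(row["humidity_pct"]),
        "soil_moisture": float(row["soil_moisture"]),
        "recorded_at": datetime.fromisoformat(row["timestamp"]).replace(tzinfo=timezone.utc),
    }

def migrate(key_path="service-account.json"):
    """Read all rows from the sheet and add each as a Firestore document."""
    import gspread  # imported here so to_document stays testable offline
    from google.cloud import firestore
    gc = gspread.service_account(filename=key_path)
    rows = gc.open("Sensor Log").sheet1.get_all_records()
    db = firestore.Client.from_service_account_json(key_path)
    for row in rows:
        db.collection("sensor_readings").add(to_document(row))
```

Both clients here authenticate with the same service-account JSON key, which is the part that took the most care to set up: the account needs access to the sheet (shared with its email address) and the Cloud Datastore User role for Firestore.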

Once I had the data flowing into Firestore, I added another layer to the workflow. The script connects to the OpenAI API and sends the last day’s growing condition data along with a short prompt asking for a natural-language summary suitable for a farmer. I asked for four items:

  1. Overall conditions (temperature and humidity)
  2. Soil moisture status
  3. A practical recommendation—something like irrigation timing, frost risk, or pest watch
  4. A calculation for evapotranspiration (ET) based on temperature and humidity

The response is then passed to Gmail, which sends me the summary automatically. To make that work, I had to enable the Gmail API and create an OAuth 2.0 Client ID so the script could send messages through my own account. Once the permissions were set and the bugs were worked out, the process ran smoothly.
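The Gmail step boils down to encoding a MIME message the way the API expects and sending it with the OAuth credentials. A minimal sketch, assuming the recipient address and subject line are placeholders:

```python
# Sketch of the Gmail send step. Recipient address and subject are assumptions.
import base64
from email.message import EmailMessage

def encode_message(to_addr, subject, body_text):
    """Build a MIME message and base64url-encode it as the Gmail API requires."""
    msg = EmailMessage()
    msg["To"] = to_addr
    msg["Subject"] = subject
    msg.set_content(body_text)
    return {"raw": base64.urlsafe_b64encode(msg.as_bytes()).decode()}

def send_report(creds, summary):
    """Send the summary through the Gmail API with OAuth 2.0 credentials."""
    from googleapiclient.discovery import build  # local import keeps encode_message testable offline
    service = build("gmail", "v1", credentials=creds)
    service.users().messages().send(
        userId="me",
        body=encode_message("me@example.com", "Daily field report", summary),
    ).execute()
```

The `userId="me"` shorthand is what ties the message to the account that authorized the OAuth client, which is why the Client ID setup mattered.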

After all of that, here is the resulting email in my inbox.

I’ll clearly need to do some work on prompt engineering to get a good result. Once I have weeks’ or months’ worth of data, I can ask for trends or keep a running accumulation of Growing Degree Days, annual rainfall, or total ET for the season. One current limitation is relying on a general-purpose foundation model like ChatGPT. The ag community needs an AI model trained for these types of applications. I don’t plan to take on that project for now, but maybe I can play a part one day.
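Growing Degree Days, at least, don’t need an LLM at all. The standard simple-average method (this sketch ignores the capped-temperature variant some crops use) is just the daily mean temperature above a crop-specific base, 50 °F for corn:

```python
# Simple-average Growing Degree Days; base temperature of 50 F (corn) assumed.
def growing_degree_days(t_max_f, t_min_f, t_base_f=50.0):
    """Daily GDD: mean of the daily high and low, minus the base, floored at zero."""
    return max(0.0, (t_max_f + t_min_f) / 2.0 - t_base_f)

def season_gdd(daily_highs_lows, t_base_f=50.0):
    """Accumulate GDD over a season from (high, low) pairs."""
    return sum(growing_degree_days(hi, lo, t_base_f) for hi, lo in daily_highs_lows)
```

For example, a day with a high of 86 °F and a low of 62 °F contributes (86 + 62) / 2 − 50 = 24 GDD; a day that averages below the base contributes zero.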

Right now, the script isn’t set to run on a schedule, but it can move data from Google Sheets to Firestore, generate a short report through OpenAI, and send that report to my inbox. My next step is to schedule it as a daily cron task so I can start getting these updates automatically each morning. I’ll also have to think about which machine I want running these scheduled scripts. Over time, I can refine the prompt to include seasonal or location-specific advice based on the latest readings.
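The cron entry itself is a one-liner. A sketch, with the script path, run time, and log file all hypothetical:

```shell
# Hypothetical crontab entry: run the daily report at 6:30 each morning
# and append output (and errors) to a log file. Paths are assumptions.
30 6 * * * /usr/bin/python3 /home/pi/daily_report.py >> /home/pi/report.log 2>&1
```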

The next phase will be posting the project on GitHub as part of my AI and agriculture portfolio. I have more soil and temperature sensors on the way, which will help me expand the dataset and continue exploring how automation and AI can improve decision-making in the field.