Recently we’ve experienced a number of power outages in the area. As it turned out, not all of them were intermittent failures caused by weather conditions: some were planned ahead.
I live in Kraków, so my electricity is managed by Tauron. If you live in south-western or northern Poland, chances are that Tauron distributes energy for you as well.
Fortunately, there’s a dedicated website where, after filling in your address, you can get information about planned power outages: https://www.tauron-dystrybucja.pl/wylaczenia/
So if there’s a website, there surely should be an API behind it. With the API we should be able to scrape the power outage data and, after processing it, detect whether any of the affected areas contains our street address.
In order to collect the data, I needed to find out how it’s prepared. I visited the Tauron Wyłączenia (Outages) page, opened up my browser’s DevTools, and started observing the XHR requests being sent in the Network tab.
The website requires you to fill in the whole form to get any results, but as I observed from the sent requests, only two pieces of information are actually needed: the city and the street.
Make a GET request to the following URL, providing the city name in the `partName` parameter:

```
GET https://www.tauron-dystrybucja.pl/iapi/city/GetCities?partName=%C5%9Awi%C4%85tniki+G%C3%B3rne
```
In the response we should get something similar to this:
```json
[
  {
    "Name": "Świątniki Górne",
    "Gaid": "104815",
    "ProvinceName": "Małopolskie",
    "DistrictName": "krakowski"
  }
]
```
From that response we’re going to need the `Gaid` value; we’ll use it in the next request.
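As a side note, the whole lookup is easy to reproduce outside the browser. Here’s a minimal sketch in Python using the `requests` library, with the field names taken from the response above and error handling kept to a bare minimum:

```python
import requests

BASE = "https://www.tauron-dystrybucja.pl/iapi"

def get_city_gaid(part_name: str) -> str:
    """Return the Gaid of the first city matching the given (partial) name."""
    resp = requests.get(f"{BASE}/city/GetCities", params={"partName": part_name})
    resp.raise_for_status()
    cities = resp.json()
    # A real script should handle empty or ambiguous results here
    return cities[0]["Gaid"]
```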
In a similar fashion, we’ll make a request to get our street ID. The `Gaid` value from above should now be passed in the `ownerGaid` parameter, and `partName` should be filled with the street name:
```
GET https://www.tauron-dystrybucja.pl/iapi/street/GetStreets?ownerGaid=106173&partName=STREET+NAME
```
This will return a response similar to this:
```json
[
  {
    "Name": "Aleja Adama Mickiewicza",
    "Gaid": "903472"
  }
]
```
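Continuing the Python sketch from above, the street lookup can be wrapped in another small helper:

```python
def get_street_gaid(city_gaid: str, street_name: str) -> str:
    """Return the Gaid of the first street matching the name in the given city."""
    resp = requests.get(
        f"{BASE}/street/GetStreets",
        params={"ownerGaid": city_gaid, "partName": street_name},
    )
    resp.raise_for_status()
    return resp.json()[0]["Gaid"]
```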
Now, having a `Gaid` for our street, we can use it in our final request:
```
GET https://www.tauron-dystrybucja.pl/iapi/outage/GetOutages?gaid=903472&type=street
```
Response:
```json
{
  "CurrentOutagePeriods": [
    {
      "Message": "Kraków ulica Balicka od 1 do 25; ul. Wernyhory ; ul. Bronowicka od 117 do 135A i od 136 do 146; Jabłonkowska od 1 do 3 oraz okolice",
      "StartDate": "2021-05-19T07:30:00",
      "EndDate": "2021-05-19T14:00:00",
      "OutageType": 1
    }
  ],
  "FutureOutagePeriods": [
    {
      "Message": "Kraków ulica Słowicza i Fischera – całość, Radzikowskiego od wiaduktu do Fischera oraz okolice",
      "StartDate": "2021-05-20T08:00:00",
      "EndDate": "2021-05-20T10:00:00",
      "OutageType": 1
    },
    {
      "Message": "Kraków ulica Zimorowicza 1a do 19,2a do 22 Zielińska 1, 7, Białoprądnicka 35, 37, 39, oraz okolice",
      "StartDate": "2021-05-21T08:00:00",
      "EndDate": "2021-05-21T15:00:00",
      "OutageType": 1
    },
    {
      "Message": "Kraków ulica Pylna 21 B C",
      "StartDate": "2021-05-24T09:00:00",
      "EndDate": "2021-05-24T15:00:00",
      "OutageType": 1
    }
  ]
}
```
Awesome! We get not only the data for outages planned in the future, but also for outages and power failures currently ongoing somewhere around us.
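One more helper completes the Python sketch:

```python
def get_outages(street_gaid: str) -> dict:
    """Fetch current and future outage periods for the given street."""
    resp = requests.get(
        f"{BASE}/outage/GetOutages",
        params={"gaid": street_gaid, "type": "street"},
    )
    resp.raise_for_status()
    return resp.json()
```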
Now it’s a matter of traversing the `FutureOutagePeriods[].Message` values and checking if they contain our street address.
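In the sketch, a naive case-insensitive substring check is enough. Keep in mind that `Message` is free text with house-number ranges, so a simple check like this is only a best-effort filter:

```python
def planned_outages_for(street_fragment: str, outages: dict) -> list:
    """Keep only future outages whose Message mentions our street."""
    return [
        o for o in outages.get("FutureOutagePeriods", [])
        if street_fragment.lower() in o["Message"].lower()
    ]
```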
I’ve prepared a set of two Items:
```
Switch ElectricityOutage      "Czy jest planowane wyłączenie?"
String ElectricityOutageRange "Kiedy?"
```
`ElectricityOutage` (labelled “Is an outage planned?”) is a simple ON/OFF switch that we can use to trigger any action in our home automation rules.
`ElectricityOutageRange` (labelled “When?”), however, will contain the combined date range as a string.
This formatted value will be passed to my notification service, so I know exactly when to expect the power outage and can prepare in advance.
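For illustration, a hypothetical helper that builds such a string from the `StartDate` and `EndDate` fields could look like this (the timestamps in the response parse cleanly with `datetime.fromisoformat`):

```python
from datetime import datetime

def format_range(outage: dict) -> str:
    """Combine StartDate and EndDate into one human-readable range."""
    start = datetime.fromisoformat(outage["StartDate"])
    end = datetime.fromisoformat(outage["EndDate"])
    return f"{start:%Y-%m-%d %H:%M} - {end:%H:%M}"
```

For the first future outage above this yields `2021-05-20 08:00 - 10:00`.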
Let’s create a flow that every day, at a specific time, makes a GET request to Tauron’s `GetOutages` endpoint. Then we’ll process the response using a `function` node. A `switch` node will let us perform different actions depending on whether any outages are planned or not.
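The `function` node itself is written in JavaScript inside Node-RED, but purely as a reference, here is the whole check glued together from the hypothetical Python helpers above (the city and street names are just examples):

```python
def check_outages() -> None:
    city_gaid = get_city_gaid("Kraków")
    street_gaid = get_street_gaid(city_gaid, "Mickiewicza")
    outages = get_outages(street_gaid)
    planned = planned_outages_for("Mickiewicza", outages)
    if planned:
        # The "outage planned" branch of the switch node
        ranges = ", ".join(format_range(o) for o in planned)
        print("ElectricityOutage -> ON, ElectricityOutageRange ->", ranges)
    else:
        # The "no outages" branch
        print("ElectricityOutage -> OFF")

if __name__ == "__main__":
    check_outages()
```

In the actual flow the two Items would be updated through openHAB rather than printed.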
Web scraping can be very useful in home automation, but note that public APIs like this one must not be abused. Always limit the number of requests in such scenarios to a minimum. I think it’s also polite to make these requests at times of generally lower traffic (at night, perhaps?).
Hope you find this quick tutorial useful!
I’ve recently learned about git scraping, a super convenient method of collecting data in a git repository, with all its features (like history). GitHub makes it really simple to create such repositories with automated data scraping.