After my last astronomy post, I was curious if I could develop a
cosmic sense of direction, to know, day or night, where Sirius is the
same way I generally know which way is north.
So I built this interactive sidereal clock. I will explain a little bit
about how it works and then how I got there.
(This is a screenshot. Click through to see the real version.)
About the clock
The clock displays both solar time (24 hours per day) and sidereal time (four
fewer minutes per day, but measured as 24 sidereal hours). The outside circle
shows the sidereal hours, and the inner circle (representing Earth) shows the
solar hours.
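As a quick back-of-the-envelope check on those four minutes (just arithmetic, not part of the clock's code): in one year the Earth turns about 366.25 times relative to the stars but only 365.25 times relative to the Sun, so a sidereal day comes out slightly shorter than a solar day.

solar_day = 24 * 60                          # minutes
sidereal_day = solar_day * 365.25 / 366.25   # ≈ 1436.07 minutes
print(solar_day - sidereal_day)              # ≈ 3.93 minutes shorter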
Just like 12:00 solar time is when the Sun is at its peak, 6:00 sidereal time
is when Betelgeuse in Orion is at its peak. Because stars are so far away,
they don’t seem to move much relative to each other, so every (non-polar) star
has a fixed sidereal time when it reaches its peak. This is also called
right ascension, abbreviated RA.
Right ascension is essentially a celestial measure of longitude, and at 6:00
sidereal time, the 6ʰ line runs from Polaris, straight across the highest point
of the sky, and down to the southern horizon, below which (out of sight from
the northern hemisphere) it hits the south celestial pole. At the same time, the 12ʰ line goes from
Polaris down to the eastern horizon, and the 0ʰ line to the western horizon.
Also at that time, a small part of the 18ʰ line is visible in the northern
hemisphere, from Polaris straight to the northern horizon.
In the interactive clock, you can drag the Sun around to represent different
times of year. This shows you the relative position of the Sun, Earth, and
stars at that time of year. You can see that the winter solstice is when the
Earth is between the Sun and Orion. In other words, the northern hemisphere is
always, regardless of the season, tilted towards Orion. (That won’t last
forever, but it will for my lifetime.) I marked the four seasons on the clock,
and when the Sun circles a season, that represents the solstice or equinox
(again, in the northern hemisphere).
You can also drag the little human standing on Earth to adjust the time of day.
The perspective of the Earth is looking down from above the north pole.
It is easiest to imagine yourself standing on Earth looking south, with the
eastern horizon on your left and the western horizon on your right. Although,
again, to see the entire line corresponding to the current sidereal time, you
will have to look up, up, up, and a little backwards, until you see Polaris.
You can see the difference between sidereal and solar time, because if you drag
the guy once around, the Sun will have moved slightly so you have to drag a bit
further to complete a solar day.
The effect of the Earth’s tilt on the Sun’s position is not shown here. The
Sun’s RA is correct, though. The relative movement of the Sun above or below
the Earth’s equator is what causes it to rise somewhat before or after 6am,
depending on your latitude and the season.
Choices
Although I’ve never had a very good grasp on astronomy, it is not for lack of
experience. When I was young, my mom was very interested in astronomy, and we
used to drive out to the potato fields of Idaho to see the night sky. We had a
Planisphere. We had star
mapping software, probably DOS-based, but I don’t remember which. But
everything, to me, always felt like it was moving, and never in a way I could
understand or visualize. (I think most astronomy software is especially
confusing in this respect.)
For this reason, I really wanted something that let me imagine the stars
holding still, while the Earth and Sun moved. As far as I know, there isn’t a
“standard” way to depict the night sky, but to my math brain it made sense to
put 0ʰ on the positive x axis. A top-down view allows you to have the cardinal
directions in their normal locations, although it prioritizes the southern sky
and makes the clock run counterclockwise. At my latitude, the equatorial and
mid-southern cosntellations make up more of the sky anyway. If I still lived in
Idaho or the Pacific Northwest, it might make more sense to flip things around
and look up at the earth from below, since the polar constellations are so much
more prominent and probably an easier way to orient yourself.
Technology
I used JavaScript (mostly d3) to build an SVG file. At
first I had sliders for the time of year and time of day, and then I realized I
could just watch for drag events on certain parts of the image. Feel free to
take a look at the code; it is only a few hundred very spaced-out lines.
I used a Planisphere to make sure that I was getting things correct.
Success?
I had a ton of fun making this. The process of working everything out really
helped it click into my brain.
Ultimately, the easiest way to orient myself seems to be to think first about
where the Sun is in relation to the stars, and then where I am in relation to
the Sun, rather than think explicitly about sidereal time. Which is essentially
how it has been done for thousands of years with the Zodiac, but I find that
the extra step of fixing the Earth’s orbit as up-down-left-right is critical
for my spatial awareness.
I don’t know how anyone can argue that LTE is “fast
enough”
compared to 5G. Seeing LTE in the status bar is enough to convince me to put my
phone down for a while.
And millimeter wave is amazing. I get faster than 1 Gbps cellular downloads at
my office, so I don’t even bother connecting my phone to Wi-Fi.
In a recent post,
I made a bit of an eye-roll comment about how Letterboxd rejects requests
based on their user-agent header. Then today, it happened when I was making a
request to my own website. I knew (or thought I knew) that I don’t do anything
so ridiculous, so I set out to investigate.
I host my site these days on a Linode instance using nginx, which I was pretty
sure doesn’t do anything strange based on user-agent headers. Looking at
my logs showed that my request wasn’t making it to nginx.
When Google Domains shut down, I migrated my DNS hosting to Cloudflare, so that
was the next possible culprit. Cloudflare offers traffic proxying, so that
all of your traffic goes to their servers, and they send requests to your
servers. This isn’t something I was looking for when I switched, but they make
it easy to enable, and it may even be the default.
Inspecting the actual response that I was getting in Python, I saw the message
error code: 1010. (For Letterboxd, I had seen the 403 response and hadn’t
looked deeper into it, but it was probably the same thing.) Some searching
confirmed that this was indeed coming from Cloudflare.
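For reference, the check looked something like this (the URL is a placeholder for one of my own pages):

import urllib.request
import urllib.error

try:
    urllib.request.urlopen('https://example.com/some/page')
except urllib.error.HTTPError as e:
    print(e.code)    # 403
    print(e.read())  # b'error code: 1010'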
If you go to “Security > Settings” in the Cloudflare dashboard, you can turn
off “Browser Integrity Check,” which fixes the problem.
Or just turn off proxying entirely, although that takes a little longer to take
effect, because the DNS settings need time to propagate.
What started off as a quick project to get the moon’s position each day turned
into something bigger, after I realized that I knew so little about
astronomy that even my grandparents would be ashamed.
What made it fun was the amazing amount of data and resources I had to convert
whatever questions I had into graphs and pictures, until I finally reached the
level of understanding of the first chapter of an introductory astronomy
textbook.
What I knew going into this project was that the moon rises and sets from
seemingly random parts of the sky, unlike the sun which, where I live, slowly
progresses from southeast in the winter, to east at the equinoxes, and
northeast in the summer. (Again, if you know anything about astronomy, you
already understand why this happens, but bear with me for the charts, at
least.)
I found the excellent Skyfield Python
library, which downloads a set of position and velocity data for solar system
objects, and lets you make computations based on these. (Note: I didn’t design
that website, even though it looks uncannily similar to this one.) This is
what I was using anyway to find out when the moonrise and moonset were, the
current phase of the moon, and the location of the moon.
Here is how you find out the next moonrise, for example:
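Something like this sketch (the latitude and longitude here are placeholders, not my real location):

from skyfield import almanac
from skyfield.api import load, wgs84

ts = load.timescale()
eph = load('de421.bsp')
here = wgs84.latlon(40.76, -111.89)

t0 = ts.now()
t1 = ts.tt_jd(t0.tt + 26 / 24)   # 26 hours from now

f = almanac.risings_and_settings(eph, eph['moon'], here)
times, events = almanac.find_discrete(t0, t1, f)
moonrises = times.utc_datetime()[events == 1]   # keep risings (1), drop settings (0)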
This actually gives you all moonrises in the next 26 hours, which will usually
be a Numpy array of length 1. What the library is doing is using the location
of the earth and moon from the ephemeris file (which the library auto-downloads) to
determine where the moon is with respect to you and your current horizon, and
find out when it crosses into view.
If you want to find the current position of the Moon, you can use
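something like this (a sketch rather than the exact original, reusing the eph, ts, and here objects from above):

t = ts.now()
me = (eph['earth'] + here).at(t)        # barycentric position of me, right now
astrometric = me.observe(eph['moon'])   # accounts for light-travel time
alt, az, distance = astrometric.apparent().altaz()
print(alt.degrees, az.degrees)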
There is already fun stuff going on here (although mostly irrelevant for the
broader question). You start with the Barycentric coordinates of me, which
gives your current location in the solar system. Calling observe converts the
moon’s position to an Astrometric position, which takes into account the speed
of light (rewinding the moon’s position by 1.3 seconds). Then apparent takes
gravity into account. My instinct is that this is overkill for the moon, but
should have some effect on the planets.
Anyway, this gives you the apparent altitude of the moon, with 90° being
straight up, 0° on the horizon, and negative numbers below the
horizon. And you get the azimuth from 0° to 360°, which is the compass
direction you should face to see the moon.
Everything in Skyfield works on Numpy arrays as well, so you can compute
several points at once. For example, here is a polar plot of one month’s worth
of moon paths, with a transformation of the altitude so that 90° is at the
origin and 0° is on the unit circle. I set negative altitudes to np.nan so
that they won’t be drawn on the plot.
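Here is roughly what that plotting code could look like (a sketch reusing eph, ts, and here; the month and the 15-minute sampling step are arbitrary choices):

import numpy as np
from matplotlib import pyplot as plt

# One month of positions, sampled every 15 minutes.
t = ts.utc(2024, 1, 1, 0, range(0, 31 * 24 * 60, 15))
alt, az, _ = (eph['earth'] + here).at(t).observe(eph['moon']).apparent().altaz()

altitude = alt.degrees
altitude[altitude < 0] = np.nan                 # hide everything below the horizon

ax = plt.subplot(projection='polar')
ax.plot(np.radians(az.degrees), 90 - altitude)  # 90° of altitude maps to the origin
ax.set_rlim(0, 90)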
You also can use a different projection to get the same picture in a way that
looks a little more like how it would look if you were facing south and
watching it for the whole month. Note that this covers all phases of the moon,
including the new moon, which you might have a hard time seeing in real life.
It appears that the location of the moon throughout one night or day is
determined primarily by where it rises, so I made a graph for an entire year of
the azimuth at moonrise:
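Roughly, the code involved could look like this (again a sketch reusing eph, ts, and here, with the year hard-coded):

from matplotlib import pyplot as plt

t0, t1 = ts.utc(2024, 1, 1), ts.utc(2025, 1, 1)

# Every rising and setting during the year; events are 1 for a rise, 0 for a set.
f = almanac.risings_and_settings(eph, eph['moon'], here)
times, events = almanac.find_discrete(t0, t1, f)

alt, az, _ = (eph['earth'] + here).at(times).observe(eph['moon']).apparent().altaz()
rise_azimuths = az.degrees[events == 1]
rise_dates = times.utc_datetime()[events == 1]

# Moon phases for the x-grid (0 = new moon, 2 = full moon).
phase_times, phases = almanac.find_discrete(t0, t1, almanac.moon_phases(eph))

plt.plot(rise_dates, rise_azimuths, '.')
plt.xticks(phase_times.utc_datetime()[phases == 0])   # a grid line at each new moon
plt.grid(True)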
This finds all of the risings during the year as a Numpy array, then finds the
corresponding array of azimuths. For reference, I also found all of the moon
phases of the year (0 is new moon, 2 is full) and used that as the x-grid.
Here is where I first realized the pattern, which is that at the summer
solstice, when the sunrise is furthest north, the full moon is furthest to the
south (and takes the shortest path across the sky), while at the winter
solstice, when the sun is furthest south, the full moon is furthest to the
north (and takes the longest path across the sky). This makes sense because
when the moon is full, it is opposite the sun. Similarly, for new moons, the
moon matches the sun’s position, which also is what I expected.
I still didn’t really understand what was going on between, and for a brief
period, I thought that all of this movement was a result of the 5-degree tilt
of the moon’s orbit that I had read about. But the numbers were too big for
that, and quickly I realized that I didn’t even know whether the 5-degree tilt
was with respect to the earth’s equator or the earth’s orbit around the sun.
Doing a bit more poking around and looking at an actual textbook, I confirmed
that it was indeed the earth’s tilt that was the primary cause of the moon’s
apparent motion, that is, that the moon’s orbit stays close to the plane of the
earth’s orbit around the sun.
Everything finally clicked, and I realized that, due to the moon’s orbit, the
earth is tilted towards and away from the moon just like it is from the sun,
going through a cycle once a month rather than once a year. And of course,
slightly out of sync with the moon phase cycle, which is longer than the
moon’s orbital cycle due to our revolution around the sun.
And then I learned what I should have remembered from high school, or from
knowledge passed on from my ancestors, that the ecliptic, the imaginary line
through the sky where the earth’s orbital plane intersects the celestial
sphere, the line that contains all of the zodiac constellations, does
about the same wibble-wobble that the Moon does, and what I am really seeing is
the effect of the earth spinning along a different axis from its solar orbit.
Actually, this lack of basic knowledge is what tripped me up when I was first
reading about this, because in more than one place, I read that the moon went
further north or south from one night to the next because “it rises later.”
Which doesn’t make any sense at all, unless you think of the moon as
constrained to the ecliptic and the point where the ecliptic touches the
horizon as oscillating back and forth as in this
animation. My lack of
intuition about the stars also manifested when I read things like “the moon’s
orbit makes it appear to move west to east across the sky.” Which is silly
because obviously the moon appears to move east to west just like the sun and
all of the stars, but if you can imagine the star field as fixed, then you can
see the moon’s eastward movement.
That’s the end of my journey for now. It was fun. The solar system is exciting,
and I can’t believe how painstakingly ancient humans tracked the stars, sun, and
moon over time to gain all of this knowledge, when we can just download a file
and get to work.
Okay, one last graph that I drew. It shows the 12 crescent moons of 2025 and
their positions relative to the setting sun. At the bottom is the position of
the sun, just before sunset. At the top is the location of the moon at the same
time, about one day after new. The line that connects them is labeled with the
date. Here you can clearly see the difference between a spring moon, which, as
it orbits towards first quarter, heads towards the northern hemisphere, and a
fall moon, which is headed towards the southern hemisphere, and so will appear
much further to the south and set to the south of the sunset.
Power substation
After all of Casey’s Home Assistant talk, I finally gave in and tried it out. The number of integrations it has with different devices is impressive, and the event-based programming you can do is fun. Template-focused programming is tedious, but almost anything is better than HomeKit and (maybe this is a hot take) Shortcuts.
At the park
There are more important things going on in the world than map controversies, but the historical role of the U.S. government is not to dictate place names, but to create the database in order to facilitate commerce, uniformity, etc. Does the database maintainer have some amount of power over naming? Yes, but if the databases hadn’t existed and been reliable over the last 20 years, tech companies would have just made their own from scratch. If the official government sources become propaganda, you should stop depending on them.
Everyone knows time zones are hard. I know time zones are hard. But that didn’t
prepare me for the mess I got myself into dealing with incorrect time zones
between my Sony camera and Apple Photos.
Let me start with some conventions, so this doesn’t get any more confusing than
it has to be. I’m going to say naive time for the date and time that
you see on a clock, without any time zone information, like
2024-01-27 12:00:00. I will say absolute time to
mean an actual point in time, independent of time zones. This is usually
written as a naive time plus a UTC offset, such as
2024-01-27 12:00:00-0800 or
2024-01-27 15:00:00-0500. Note that those two absolute times
are in fact the same time. Changing UTC offsets is like a change of units,
nothing more.
I’m going to say time zone for the geographical location where a time
exists, which is usually specified by a major city in the zone, such as Los
Angeles. I will say offset to mean the difference between UTC and a
particular timezone at a particular time, which is usually written as -0800
for eight hours behind UTC.
The distinction between a time zone and an offset is subtle. The key difference
is that a time zone doesn’t change throughout the year (unless laws change
significantly), but the offset for a given time zone can, and often does due to
Daylight Saving Time (DST hereafter). You can’t figure out a time zone from a
time plus offset, but you can guess (Phoenix and Los Angeles have the same
offset in the summer, but they aren’t the same time zone). You can usually
figure out an offset from a naive time plus a time zone, but there are
ambiguities, like 1:30 am on a certain Sunday in November.
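To make the distinction concrete, here is a quick illustration with Python’s zoneinfo module (nothing to do with the photo workflow itself):

from datetime import datetime
from zoneinfo import ZoneInfo

# Same offset in the summer, but different time zones.
summer = datetime(2024, 7, 1, 12, 0)
print(summer.replace(tzinfo=ZoneInfo('America/Phoenix')).utcoffset())      # UTC-7
print(summer.replace(tzinfo=ZoneInfo('America/Los_Angeles')).utcoffset())  # UTC-7 (PDT)

# An ambiguous naive time: 1:30 am on the Sunday DST ends.
ambiguous = datetime(2024, 11, 3, 1, 30, tzinfo=ZoneInfo('America/Los_Angeles'))
print(ambiguous.utcoffset())                   # UTC-7, the first 1:30 (still PDT)
print(ambiguous.replace(fold=1).utcoffset())   # UTC-8, the second 1:30 (back to PST)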
EXIF
The time information for a photo is stored in EXIF format in two separate
fields: DateTimeOriginal stores a naive time, and OffsetTimeOriginal stores
the UTC offset. EXIF does not store the time zone, only the offset.
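For example, with Pillow you can read both fields straight off a JPEG (a sketch; the filename is a placeholder, and the numeric IDs are the standard EXIF tag numbers):

from PIL import Image

exif = Image.open('photo.jpg').getexif()
exif_ifd = exif.get_ifd(0x8769)    # the Exif sub-IFD
print(exif_ifd.get(0x9003))        # DateTimeOriginal, e.g. '2024:01:27 12:00:00'
print(exif_ifd.get(0x9011))        # OffsetTimeOriginal, e.g. '-08:00'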
Ideally, programs handle changing the time zone (which is mostly a display
preference) separately from correcting the absolute time. (Spoiler! They
don’t.)
Sony
The problem started with my Sony camera thanks to a bad user interface and, I
suppose, user error on my part. The camera has three separate settings:
“Date/Time,” “Daylight Savings [sic],” and “Area Setting.” I set it up in the
winter with the correct time and location, but set “Daylight Savings” to “On”
because I thought that it was offering to automatically adjust for DST. What it
actually did is set the offset to -0700, even though the correct offset in
the winter is -0800.
This means that all of the pictures I took got not only the wrong time zone,
but also the wrong absolute time. I first noticed it when my Sony photos and my
iPhone photos were in the wrong order in Apple Photos, despite the time seeming
to be set correctly on the camera. But more on that below.
The “correct” way to use the camera settings is to first set the location and
Daylight Savings setting, and then set the time. From then on, whenever you
adjust the location (because you traveled) or the Daylight Savings setting
(every March and November), it automatically adjusts the Date/Time in the
correct direction. If you forget to change either of these, at least you will
still record the correct absolute time. But if you ever try to fix the time
without first correcting the other two settings, you will end up with the wrong
absolute time.
Apple Photos
RAWs
Let me first note that if you import RAWs into Apple Photos, it does all sorts
of weird things to the time zones:
Inside the app, it seems that RAWs just use the naive time with your Mac’s
time zone.
If you use Export Original, you do at least get the actual original with its
EXIF offset intact.
If you export JPEG, you get the original’s naive time but with your Mac’s
time zone.
If you drag a RAW out of Photos onto a Finder window, it converts into JPEG,
and you get a naive time, with the offset time left blank.
It’s an absolute madhouse, but I generally don’t put RAWs into Apple Photos.
JPEGs
If you import JPEGs into Apple Photos, it sees the offset. The Info pane
shows only the photo’s naive time, but the photos are sorted according to
absolute time. This is how I first discovered that my camera settings were
wonky. I had a photo taken at 10:00 by my Sony that was sorted before one taken
at 9:30 by my iPhone, because one was stored correctly as 09:30:00-0800, and
the other was stored incorrectly as 10:00:00-0700, which is the same as
09:00:00-0800.
To fix the incorrect time zone in Photos, you can use the “Adjust Date and
Time” dialog box, which is in the menu bar or can be triggered by double
clicking the time in the Info pane. As far as I know, this is the only place
where you can see the time zone information. If you do change the time zone,
the dialog “helpfully” adjusts the time in the opposite direction, so that by
default (as long as you are operating on only one photo, more on this in a
minute) you are changing only the time zone and not the absolute time. But if
you counteract this and set the new time to the same as the old time, then you
can convince Photos to change only the time zone.
If you manage to set the absolute time correctly on your camera but forget to
adjust it for traveling or Daylight Saving Time, the assist from the dialog box
is exactly what you need. Just change the time zone and the fixes to the naive
time will be exactly what you want to preserve the original absolute time. Here
is an example where I changed from -0500 (set nonsensically in my camera as
Central Daylight Time even though it is winter) to -0800. The message says,
“The original photo will be adjusted by -3 hours” which is a little confusing,
but once you understand exactly what is going on, it will make more sense.
Where it all falls apart
So far, everything is mostly understandable once you figure out what is
happening, though the user interface remains confusing and is further
complicated by the fact that the time zone itself isn’t visible in the Info
pane. But it gets worse.
“Adjust Date and Time” can be and usually is used on multiple photos at once.
It would be far too many clicks to make the adjustment for every single photo
individually. It shows you the first photo and has you pick the new date, time
and time zone, and then it applies “the same” relative adjustment to every
photo in the batch.
If you are adjusting absolute times, a batch operation makes sense; in fact,
you can just alter all of the naive times and leave the offsets alone, and this
would be technically correct, although with DST boundaries it could push some
times into offsets that are nonsensical for the time zones where they were
taken.
If your goal is to correct the time zones while preserving the absolute times,
it is also clear what the batch operation should be.
Unfortunately, Photos combines both time adjustments and time zone adjustments
into the same operation, and it will only do something sensible if all of the
selected photos have the exact same time zone to begin with. It also makes the
time-zone-only transformation difficult and error prone.
As far as I can tell, the batch edit is always performed as the following two
separate transformations:
The time zone is set to the new time zone for all selected photos
The naive time is adjusted by the same number of hours for all selected
photos
In particular, this means that there is no way to correct time zones while
preserving absolute times. That is, if you change the time zone for several
photos which are not all from the same time zone, you will adjust the absolute
times by varying amounts. Here is a plausible scenario:
You get off the plane at 2 p.m. in New York
You take a photo marked 11:07:00-0800 because you forgot to change the time
zone
Better late than never, you fix the time zone in the camera, so the next
photo says 14:09:00-0500
Later, you select the two photos in Apple Photos and begin to batch change the
time zone to NYC. The dialog suggests changing the time to 2:07 p.m., which is
correct for the first photo. This is implemented as “change offset to -0500
and add 3 to the naive time.” The first photo gets a new time of
14:07:00‑0500 (good job), and the second changes to
17:09:00‑0500. Oops.
I use Day One a lot, jotting down thoughts or memories, or anything that I
might forget. It is often useful to be able to remember what I was doing
or thinking around a given time in the past.
I also use Letterboxd to track the movies that I watch. Sometimes it is
helpful to remember whether or not I have seen a movie and whether or not I
enjoyed it. It also helps manage my watchlist and gives me recommendations from
friends.
I thought it would be fun to combine the two, so that I can use Day One to
remember when or if I saw a movie. Letterboxd has RSS feeds and Day One
has a macOS command-line interface (you can install it from a menu item in
the app).
So here is a Python script to parse the RSS feed, download the poster, and
create a new Day One entry.
#!/usr/bin/env python3
from tempfile import NamedTemporaryFile
from xml.etree import ElementTree
import re
import subprocess
import urllib.request


# Letterboxd rejects requests with the "python" user agent.
def curl(url):
    req = urllib.request.Request(url, headers={'User-Agent': 'curl/7.64.1'})
    return urllib.request.urlopen(req)


# Returns None if it can't find an image.
def extract_image_url(html):
    m = re.search(r'img src="([^"]+)"', html)
    return m if m is None else m.group(1)


def create_entry(item):
    # Letterboxd uses a custom XML namespace for some elements
    ns = {'ns': 'https://letterboxd.com'}

    # Extract the watched date. If there isn't one, I don't want it.
    o = item.find('ns:watchedDate', ns)
    if o is None:
        return
    date = o.text

    # Get the title and year.
    title = item.find('ns:filmTitle', ns).text
    year = int(item.find('ns:filmYear', ns).text)

    # Get my rating if there is one, and translate it into stars.
    o = item.find('ns:memberRating', ns)
    stars = ''
    if o is not None:
        rating = float(o.text)
        stars = ' - ' + '★' * int(rating)
        if rating != int(rating):
            stars += '½'

    # The RSS description has a poster image.
    o = item.find('description')
    image = o if o is None else extract_image_url(o.text)

    # Prepare the Day One command.
    text = f'Watched {title} ({year}){stars}'
    command = ['/usr/local/bin/dayone2', '--journal', 'Media',
               '--date', date, '--all-day', 'new']

    with NamedTemporaryFile('wb', suffix='.jpg', delete_on_close=False) as fh:
        if image:
            fh.write(curl(image).read())
            fh.close()
            command.extend(['--attachments', fh.name])
        subprocess.run(command, input=text, check=True, text=True)


with open('/path/to/state/file', 'r+') as fh:
    already_downloaded = set(fh.read().split('\n'))
    root = ElementTree.parse(curl('https://letterboxd.com/username/rss/'))
    for item in root.findall('./channel/item'):
        guid = item.find('guid').text
        if guid in already_downloaded:
            continue
        create_entry(item)
        fh.write(f'{guid}\n')
        fh.flush()
I created a separate journal in Day One called “Media” for these, so that they
can be separate from my normal entries.
The GUID of each entry is written into a state file, so that I don’t download
anything more than once. I used 'r+' so that I can both read and write.
After reading the entire file, the cursor is at the end, which is where the
writes happen. It also requires that the file already exist, which I appreciate
because it means if I give it the wrong filename, it will crash instead of
creating the entire set of entries again.
NamedTemporaryFile creates and opens a temp file where the poster image can be
stored. By setting delete_on_close=False, I can close the file and it stays
around until the end of the context block. If you are reading closely, you may
notice that it creates and then deletes a temp file even if there is no image.
I’m okay with that.
ElementTree does a weird thing where the object returned by item.find()
evaluates to False even if it exists. This is why there are a bunch of
if o is not None instead of the simpler if o.
Lastly, the RSS feed only has about 25 entries. If you want older data, you’ll
have to get that some other way. Letterboxd doesn’t have a publicly available
API, but they will give you a CSV file, and each row has a link to the movie
page. With a little work (and don’t forget to set the user agent), you can
scrape a bunch more posters.
As a bonus, the image view in Day One gives me a nice looking table of my
watch history. Here is what I was doing about nine months ago.
It’s sad to see so many people going along with or cheering on Trump’s inhumane and cruel policies. I like to imagine that someday they’ll realize what he is and feel shame for their support, but who am I kidding? Even those who were on the side of racism during the civil rights movement never felt a bit of remorse.
Also going through my head this morning:
I hope you’re proud how you would grovel in submission
To feed your own ambition
So though I can’t imagine how
I hope you’re happy now
I’m trying a couple of new things on this site.
One is to be a little less guarded about myself. I’m sometimes afraid to
overshare and so end up overly reserved. I’ve started with a more comprehensive
and personal about me page.
I’ve been posting some photos using the Glass app over the last couple of
years, and I’ve now copied them here as well, along with
an RSS feed.
Since I last analyzed my electricity usage two years ago, several things have changed:
I had a 240V charger installed, which speeds up charge time and draws more power.
My electric company changed the XML data slightly, with an entry every 15
minutes instead of 60, and with entries for power returned to the grid. For
me, the latter are always zero because my house does not generate
electricity.
We replaced our family-hauling gas minivan with an electric vehicle, so now
there are two cars to charge. This happened at the end of the year, so
you don’t really see it in the data yet.
To extract the data in Python, using the built-in xml.etree module to convert
to a Pandas series, I reused most of the code from last time:
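The code was along these lines (a sketch: the element names are guesses at the utility’s Green Button-style schema; the @rel='Delivered' filter and the sort_index call are the parts that matter):

import xml.etree.ElementTree as ET
import pandas as pd

tree = ET.parse('usage.xml')
times, values = [], []
for reading in tree.findall(".//IntervalBlock[@rel='Delivered']/IntervalReading"):
    start = int(reading.find('timePeriod/start').text)   # epoch seconds
    values.append(int(reading.find('value').text))       # watt-hours in the interval
    times.append(pd.Timestamp(start, unit='s'))

ts = pd.Series(values, index=times).sort_index()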
The main difference is the addition of [@rel='Delivered'] to filter to only
power delivered to me and not the other way around. I also added the
sort_index command, because for some reason the dates are not entirely in
order in the XML.
At this point, I wanted to determine when I was charging a car. Charge loads are
probably pretty easy to isolate, because they last for a long time and are
relatively constant. If I were trying to be robust, I would probably figure out
what expected power draw for a given time of year and time of day, and
then find out periods where the draw is significantly higher than that. Using
something simple like the median of the surrounding 14 days of a given time
would probably work, since I charge less than half of the days.
But in my case, the 7.5 kW of power that our electric Mini draws is more than
our entire house uses over any 30-minute period. There are five 15-minute
periods that reach that level, but these are relatively easy to filter out.
I wrote this code to compute the charge state. I wanted to separate it into
cycles of “off”, “start”, “on”, and “stop”. My thinking was that these “start”
and “stop” periods are probably times where I was charging the car for some but
not all of the 15-minute period. I used a threshold of 1800 Wh, which is 7.2 kW
over a 15-minute period.
 1  THRESHOLD = 1800
 2  charge_state = pd.Series('off', index=ts.index)
 3  charging = False
 4  for i in range(len(ts)):
 5      if charging:
 6          if ts.iloc[i] >= THRESHOLD:
 7              charge_state.iloc[i] = 'on'
 8          else:
 9              charge_state.iloc[i] = 'stop'
10              charging = False
11      # Look at the two entries after this one to see if
12      # they are both above the threshold.
13      elif all(ts.iloc[i+1:i+3] >= THRESHOLD) and i + 1 != len(ts):
14          charge_state.iloc[i] = 'start'
15          charging = True
Line 2 creates a new series with the same index as our time series. We then
look at the entries one by one and determine when to transition to “start”
(Line 13, if we are not already charging and we see two upcoming entries above
the threshold), when to stay “on” (Line 6, as long as we stay above the
threshold), when to transition to “stop” (Line 8, as soon as we first go below
the threshold). Note that Pandas uses iloc to look up an entry by integer
offset, rather than by time.
With this charge_state series, it is easy to play around with the data. For
example, to count how many charge sessions:
sum(charge_state == 'start')
To look at the entries where usage is high but you aren’t charging, that is,
“filter ts to points where it is above the threshold but charge_state is off”:
ts[(ts > THRESHOLD) & (charge_state == 'off')]
Finally, a good visualization is always helpful to understand the data.
I don’t usually use much ChatGPT while programming, because at work I am
usually dealing with complicated code that I don’t want to mess up. But it is
impossible to remember how to do anything in Matplotlib, and I confess that I
asked ChatGPT for a lot of help, and it did a pretty good job most of the time.
Here my goal is to draw dots at start and stop time for each charge, and
connect them with a line. I really just have three arrays here:
start_time is the full date and time when I started charging. This is used
as the x axis.
start_hour is the time of day when I started charging.
charge_hours is the number of hours that I charged.
Note that since charging often happens overnight, I’m using the end time as
start_hour + charge_hours, which might be greater than 24, but I think that makes a
better visualization than wrapping around to the bottom.
from matplotlib import pyplot as plt
import numpy as np

start_time = ts[charge_state == 'start'].index
start_hour = start_time.hour + start_time.minute / 60
charge_hours = (ts[charge_state == 'stop'].index
                - ts[charge_state == 'start'].index).total_seconds() / 3600

fig, ax = plt.subplots(figsize=(15, 8))

# The bars
bars = ax.bar(start_time, charge_hours, bottom=start_hour, color='b')
# Draw the number on top of the bar
ax.bar_label(bars, labels=[f'{h:.0f}' for h in charge_hours], padding=4)
# Bottom dot
ax.scatter(start_time, start_hour, marker='.', color='b')
# Top dot
ax.scatter(start_time, start_hour + charge_hours, marker='.', color='b')

# Add a grid and make it denser than it wants to be.
plt.grid(True)
plt.yticks(np.arange(0, 29, 2));
And here is the final result. The Mini only has a 32 kWh battery, so it can
always fill up in four hours or so. The longer lines from December are for the
new car, which has triple the battery size, but it can also max out the 50-amp
circuit that my charger is on by pulling 9.5 kW. (If you do the math, that is
only 40 amps, because electrical code requires that a continuous load use only
80% of the rated amperage.)
The Mini used to be scheduled to start at 11 p.m., because it draws 30 amps on
our 100 amp service, and I was afraid that if I charged it while everyone was
still awake it might trip the breaker. In November, I decided to stop babying
our house, and scheduled the charge to start at 9:15 instead. Cheap
electricity starts at 9:00.
Also, a quirk of the Mini app is that if you plug it in on a Saturday night
(the way my schedule is set), it won’t start charging until Sunday morning at
3:00. That is a long story for another time.
The back side of a clock from inside Musée d’Orsay. In the distance, the Louvre.