CV as Yak Shave

by GenericJam

Tags: wasting, time, old, fashioned, way

or

Mouse in a Maze

or

Bash Your Head Against Every Wall Until a Door Appears

or

How to Make the Most Expensive Free Tier

or

Repeatedly Falling on My Face for Fun and Profit

Part of the reason for writing this synopsis is to offer an antidote to the articles that have all the answers, as if the process to completion were a straight line. It is rarely a straight line, at least for me. Sometimes it feels like 95% frustration and 5% gratification, where I momentarily get to see the horizon before heading into the next valley of frustration. If you’re in the valley, you’re not alone. We just can’t see each other because we’re in our own valleys.

The Spec…. Sounds Easy Enough

I recently found myself looking for work (in exchange for money) again, so it was time to update my CV (aka resume). My previous one was written in React circa 2018, which is basically the stone age in JavaScript time. Heroku had abandoned it to the sands of time. I looked into resurrecting it for nostalgia’s sake, but the error messages made me feel that was probably a waste of time, given that I wanted to rewrite it in the new hotness of Phoenix LiveView.

I didn’t have any better ideas, so I decided to recreate the React version with a few new bells and a single whistle. The original featured a banner with my contact details and my picture, plus a series of defaced pictures of me in a side panel on the right. I had created that series of images using PIL by messing around with color values per pixel. I had recently written an article on image processing in Elixir, so I had a pretty good idea of how to do the same in Elixir, and even create the images live for a truly random, one-off experience. I seem to have a thing for random data.

The new idea is that instead of showing a single mutated image, a series of images is continuously pumped out so the user sees the image devolve. A feature of the original React app was that if you clicked on an image, the color scheme of the page changed to something like the image. I wanted to use this again, but now the color scheme possibilities change as the image changes, so it creates a unique experience every time a user interacts with it.

Me but not as I know it! An image I produced in testing. This is the general idea.

The Build

Building in LiveView is quite fun and moves pretty quickly. Within a day I had the structure built and I was moving on to morphing the images. The only thing slightly out of the ordinary was that I introduced a GenServer to manage the image morphing. That’s probably where the troubles started for the deployment.

I didn’t know how to show dynamically generated images, but I knew that Livebook did, so I inspected an image generated in Livebook: the whole image is a Base64-encoded string, which is super long. This feels like a good fit for LiveView, as it lets us update the image and the HTML at the same time, instead of making two server calls as we would with a conventionally served image.
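The Base64 trick can be sketched in a few lines. This is a minimal, hypothetical example rather than the code from the app: the module and function names are mine, and it assumes the generated frame arrives as raw PNG bytes.

```elixir
# Minimal sketch of serving a generated image as a data URI.
# The module and function names are illustrative, not from the app.
defmodule DataUri do
  @doc """
  Wrap raw PNG bytes in a data URI that can be used directly as the
  `src` of an `<img>` tag, so the image travels in the same LiveView
  diff as the rest of the HTML instead of via a second request.
  """
  def from_png(png_bytes) when is_binary(png_bytes) do
    "data:image/png;base64," <> Base.encode64(png_bytes)
  end
end
```

In a LiveView template the assign would then be rendered as something like `<img src={@frame} />`, and each `assign(socket, :frame, DataUri.from_png(bytes))` pushes a new image to the client without a separate image request.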

Apart from minor developmental hiccups it was pretty smooth sailing. It worked on my machine, so that was good enough. The idea of dedicating 13 GenServers to each user, each managing an Nx build pipeline, was gnawing at the back of my brain. “I’m sure it will be fine as long as no more than two people at a time view it.” The lies we tell ourselves.

The Deploy

The previous version of my website was already running on Fly.io, so the plan was just to redeploy there. When I deployed, it failed immediately. I suspect this was due to the LiveView component, which I wasn’t using previously. I had to sort out some cross-origin issues, so I followed the guidelines I found and bounced to the next error.

After this fix it kept failing for a series of reasons that worked on my machine but failed in the deployment. Initially I was loading the original image from within the app, but it seems as though all these ‘extra’ files get stripped out in the release, and I couldn’t figure out where they were stored. No bother, I’ll just hardcode the image as a string in a module. That got me to the next error: 13 Nx pipelines get OOM-killed rather quickly with 256 MB of RAM on the free tier. In retrospect, this is a bit obvious.

Back to the Drawing Board

So it looks like the completely dynamic experience of generating the images on the fly is out. Maybe I can just cache the images somehow. Maybe I could store the images as a series and then iterate through the series. If there were enough images in the series, the illusion of randomness could be maintained; the images all have a similar vibe anyway. The sweet spot looked like 120 images per series, with 13 displayed on the page at a time. So if there are 200 sets, that’s probably enough variation to be indistinguishable from live generation. In retrospect, 200 is overkill and life would have been a lot simpler with half that number. That’s called foreshadowing.

The only sensible way to deal with these was in a database, as the strings are super long and unmanageable, but that’s totally fine if I never have to deal with them directly. Even contained in a DB they’re pretty unwieldy for other reasons, as I was about to find out. The perils of haphazardly producing extra data because it’s ‘free’.

The Phoenix context generator made it super simple to generate the context for saving the images, including the schema, the migration and the handling code. That was a big help that saved a couple of hours of typing.
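For reference, the generator invocation is a one-liner. The context, schema and field names below are made up for illustration, not the ones from the project:

```shell
# Generates the context module, Ecto schema, migration and CRUD functions.
# "Gallery", "Image" and the field names are hypothetical.
mix phx.gen.context Gallery Image images data:text series:integer

# Apply the generated migration.
mix ecto.migrate
```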

The easiest thing to do was to let the thing run with the images still being generated, but put them in the DB as they showed up. I ran it and realized I should restrict it to one set of images at a time, as having all 13 running interleaved them in the DB. The simplest way to serve them was to retrieve a series and then walk through it, so it would be a lot easier if each series sat in a contiguous block. This turned out to be one of the few genuinely good ideas I had, as I only needed an offset location and to bump the ID each time. At the price of being a bit fragile, this worked great. I had created a field to identify which series an image belongs to, but by using the offset I don’t need it.
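The offset trick is easy to sketch. Assuming, as above, 120 images per series stored with contiguous, gapless IDs (the module and function names here are mine, not from the project):

```elixir
# Sketch of picking a random series by offset alone.
# Assumes sets of 120 images stored with contiguous, gapless IDs
# starting at `first_id`; no per-image "series" column is needed.
defmodule SeriesOffset do
  @images_per_set 120

  @doc "ID of the first image in a randomly chosen set."
  def random_start(set_count, first_id \\ 1) do
    set = :rand.uniform(set_count) - 1
    first_id + set * @images_per_set
  end

  @doc "The range of IDs to walk through for the set starting at start_id."
  def ids(start_id), do: start_id..(start_id + @images_per_set - 1)
end
```

A LiveView can then fetch each ID in turn with something like `Repo.get(Image, id)`; the fragility mentioned above is that a single gap in the ID sequence breaks the whole scheme.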

This Should be Done in No Time

That took an hour or two to generate 24,000 images. Great! Now all I had to do was fix the code so it pulls them from the DB instead of generating them with Nx. I also needed to somehow get my dev DB onto the Fly instance. I thought putting the records into seeds would be best, so the DB would get populated whenever it gets reset. I did a backup of the local Postgres DB, which was 1.5 GB (!). Hmm, this is a bit unwieldy. I had planned on copy-pasting the generated script into seeds, but 1.5 GB is too big for most editors to deal with, and copy-pasting the super long strings wasn’t going to work that well anyway. Also, adding 1.5 GB to the git repo seemed like a dumb idea, so I decided to migrate the data by other means.

Maybe the quickest way would have been to rerun the same code I ran on my machine on Fly and just generate the images again, but I thought there had to be a more technically savvy way, and I suspected that even running sequentially, Nx would overrun 256 MB of memory.

It’s Hard to See the Rabbit Holes Coming

Hello Pete

This led down other rabbit holes: how to use flyctl and the rest of Fly’s suite of terminal controls, as well as the online dashboard and the Fly community questions. After a couple of hours spinning my wheels I thought I was onto something using pg_restore via this community post. Now I just needed to figure out what my username, password and URL were. After further hours of digging I found out how to remotely log into iex, which let me discover the connection string via env vars. Later I found you can do the same thing from ssh by reading another env var there. Pro tip: when using the fly terminal tool, if you’re in the directory of your Fly project it will target that app by default, but the DB doesn’t have a project folder, so you need to use -a your-app-db to point it at the DB app.

Meanwhile, I also figured out how to proxy the DB to my machine and tried connecting, to no avail: I didn’t know what username or password to use.
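For anyone following the same path, the relevant flyctl incantations look roughly like this. The app names are placeholders, and flags may differ between flyctl versions:

```shell
# Open an ssh session on the app VM; the connection string lives in an
# env var there (echo $DATABASE_URL inside the session).
fly ssh console -a your-app

# The Postgres cluster is its own Fly app, so target it with -a.
fly postgres connect -a your-app-db

# Forward the remote Postgres to localhost for psql/pg_restore.
fly proxy 5432 -a your-app-db
```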

Once I had the connection string, I was able to fill in the blanks and connect to the DB. It looked deceptively like progress. When I tried to run pg_restore, it kept failing after running for a few minutes. There were some errors in the web dashboard that I didn’t really understand, suggesting I should change some Postgres setting that I had no idea if I even had access to through Fly. One step forward and two steps back. It looked like restoring the Fly DB with my local backup was not going to work.

Before I gave up on the concept of remote restoration, I tried to get my local BEAM node talking to the Fly BEAM node, but I couldn’t get them to connect. If I could do that, I could run the database commands locally and feed the items into the remote DB.

Stupid Ideas that Work

I returned to my previous unenlightened idea of just recreating the images on the Fly server, and I resolved to solve part of the problem by paying for more RAM and increasing the DB size if need be. The DB instance was 1 GB and my backup file was 1.5 GB. I wondered what happens when you run the DB beyond the edges of the box. Maybe it figures out that it needs to resize itself? It does not. It just locks up and becomes completely unresponsive. Now I needed to figure out how to resize the DB instance and get it back on track. Unfortunately, my previous run only put in ~10k entries before it grew to the size of its coffin and died. I deleted all the entries, but Phoenix somehow knows where the IDs were and restarts there (the Postgres ID sequence survives a delete). Now I needed to reset the DB back to vanilla. Thanks to this post, problem solved.

Several hours later, all 24,000 images were created. Now all I had to do was roll back the code and redeploy the final working version, which I did, and it finally worked!

Let’s Land this Mixed Metaphor

It’s Art! A surprisingly good example of a muddled message by DALL·E 2.

Sure, there were probably several easier, smarter ways to do this. For example, there’s this cutting-edge thing called a GIF that does animation all by itself, and it supports transparency. I would still have had to shoehorn the changing color scheme in there somehow, but GIFs are probably a reasonable alternative to what I ended up doing. Part of what I was trying to accomplish, though, was working with the technologies I want to be hired for. No one is looking for GIF farmers.

I was tempted so many times to redefine ‘done’ as ‘whatever is done now’, but I decided to persist because I guess I like the torture.

This CV project was successful in that I was able to rebuild a previous React project using LiveView and even bolt Nx onto the project. Remember kids: if at first you don’t succeed, keep failing until you do, or if you get tired of failing, just move your definition of success until you do succeed.

Keep smiling even if you’re in the valley right now!