Embedding Medium posts on my website

A tutorial and short story about the beginning of my journey using Medium

I’ve recently decided to use Medium as a blogging platform. I chose it for the great combination of effective reach and quality reading experience it provides. Especially on the distribution front, Medium seems to be way ahead of the other platforms I’ve tested, even if you’re new to it and haven’t formed your network of readers yet. This is most likely due to it being a mature product, profiting from a combination of network effects and refined recommendation algorithms.

From a writer’s perspective, bootstrapping an audience is essential and poses a substantial challenge. To solve it, you need a combination of great content and discoverability, neither of which is a straightforward puzzle. Creating great content is by itself an effortful and time-consuming activity. So having a platform like Medium helping you on the audience front comes in handy.

On the other hand, choosing Medium as a blogging platform has its downsides. One of them is that you lose flexibility regarding your personal brand on the web. Internet professionals and hobbyists alike are used to the idea of having an exclusive corner on the web to experiment and express themselves. A place where they can develop their digital identities, and forge their brand, be it for fun or profit. As a matter of fact, this empowerment of individuals made the internet such a successful communication channel in the first place. The idea that google.com or yourownwebsite.com are equally available for most internet users is revolutionary.

As a software developer who built a career working with the web, I, too, own a humble place like that. And although this place is now just a modest home page with a sentence and a couple of links, I have always had big plans for it. I envision a future where I can transform my website into a true platform for expressing myself creatively and connecting with others—a digital manifestation of many ideas I have and that I would like to explore.

This poses a conundrum. On the one hand, you have the independence, freedom, and control of owning your own place. On the other, you get a specialized infrastructure and a network of readers ready to consume whatever you write. At first glance, this might feel like a paradox. But I don’t believe it is. One can forge an independent space of their own on the web while leveraging a platform like Medium to help their content reach more people than it would on its own. But to make the two work together, you need to build a process around both platforms that helps them feed into each other.

With that idea in mind, I decided to sit down for a bit and work on what I understood to be the first baby step toward making my broader vision take shape. Given that I will be focusing on writing on Medium for a while, I wanted to find a quick way to show the work I’m doing here on my own website. Of course, I could add a link suggesting people visit my Medium profile, but that wouldn’t be enticing enough. I needed to show the content there. From this need, a question arose:

How could I show the content I create on Medium on my own website?

The Quest

My first straightforward idea was to manually update my website every time I posted something new on Medium. I know I won’t publish new content that often, so this would be an OK solution. But as a software developer, I’m rewarded day to day for automating monotonous tasks like that. So much so that automating has become an instinct. The slightest scent of tedious repetition triggers the automation animal inside me to come out and look for opportunities. This time would be no different, so I had to figure out how to automate the process.

My current website setup is relatively simple. I host a GitHub Page on a custom domain, leveraging a trivial template engine to generate the website files. To update it, I edit the HTML template and CSS directly and recompile them. So my technical solution would need to fit this level of simplicity. This means I’m excluding the option of a full-fledged static site generator for the time being. After all, all I have is a single HTML page with some links and a sentence.

In the end, my solution would have to do three things:

  1. Get my Medium posts;
  2. Update the statically generated website with it;
  3. Automatically do it once I publish something new.

With that in mind, I started a short adventure looking for the solution, with a time constraint of one hour to have it up and running.

1. Getting my Medium posts

And learning from how Medium gets Medium posts.

The first step was to figure out how to get my posts from Medium in an automatable way. Fetching data from a digital platform nowadays usually means consuming an HTTP-based API endpoint. But I quickly found out that that’s not the case with Medium. Their current API is limited and does not provide any way to list your posts. For a platform of this size, I found this somewhat surprising at first glance. Even considering Medium’s reasons for some closedness, providing a limited list of published posts for developers to build on top of could still be mutually beneficial. Moreover, software development seems to be a relevant niche community inside the website.

While reflecting on the reasons for such a limitation, I got curious to know what Medium uses behind the scenes to display content on the platform. First of all, they use React. Different things can indicate a React front-end, but an effortless way to verify it is by checking the website with React’s Developer Tools extension. If the extension is active when you visit a page, the page usually uses React somewhere. In Medium’s case, it is the whole web page. Apparently, they render those pages on the server side and then hand control over to the client once everything is loaded, a common approach nowadays. Given Medium’s take on user experience and the importance of SEO for them, nothing in the technology or the approach came as a surprise.

Examining further, I started to look at Medium’s network activity while navigating through the website. Looking at the requests made while visiting my Medium profile page as an example, I noticed they use GraphQL to fetch that page’s data. While not a big surprise for a complex React application, seeing that they chose GraphQL for getting data hints at why their developer-facing API might lack some features.

Although possible to overcome, maintaining a public RESTful API and a private GraphQL API side by side can pose a challenge, especially at the scale I imagine we’re talking about when discussing Medium. Add this technical intricacy to the broader and equally complex business discussion, and you can grasp why adding even basic new endpoints to their API would require meticulous scrutiny. Of course, this is just speculation, but the thought of it did take away a little bit of the frustration I felt with their API limitations.

But while diving into the waterfall of network requests, something else caught my attention: the size of some of the GraphQL queries their client was making. One query that listed stories for a given profile page, for example, was sending a staggering 32.9 KB of payload. Taken alone, this number might not feel so big, but that is a lot of data for a network request of this kind, even considering GraphQL’s peculiarities.

Screenshot of Chrome’s Developer Tools Network Pane showing the Request Payload

On closer examination, I could see it was all due to their extensive usage of GraphQL fragments. Their query fragments follow a naming convention that hints at a per-component approach to building them. For example, there’s one called UserAvatar_user, probably the fragment definition for the UserAvatar component, querying information on the user type. Some GraphQL clients, like Relay, emphasize this fragment-focused approach, and there seem to be genuine benefits in terms of performance and maintainability to adopting it. For instance, leveraging fragment caching and avoiding query duplication. Regardless, it is this approach that leads to the big payloads when querying GraphQL data.

I now knew that Medium uses React and GraphQL, probably leveraging a tool that allows them to take a fragment-oriented query composition approach on their front-end. But this was completely irrelevant to the task I had at hand. So after taking a couple of notes on topics to research later, I got back to my main task.

Googling for “how to list Medium posts on your website,” I quickly learned that everyone was resorting to RSS to overcome the lack of an API, meaning I could do the same. Some of the solutions I saw used a proxy to convert the RSS feed to JSON, making the data easier to handle. I decided to get the data directly. I found a little JS library called rss-parser that handles RSS parsing. I added it to a tiny script I could later run when building the website, and I was done. The script looked like this:

My idea was to save this JSON file and use it as a data source for generating the updated files for my website later. But before I could move on to the Updating the Website step of my task, I noticed something peculiar: Medium treats comments (or responses, as they call them) as stories!

That was a bit counterintuitive. I’d expect the stories I publish and the comments I make to be two completely distinct entities. But when I fetched my Medium RSS feed, I noticed all my “Yes, that’s great!” comments mixed in there as well. So I took another detour to understand the matter a bit more.

The most prominent discussions I found on the topic point out how the platform’s UI is misleading. Although Medium makes no big distinction between stories and comments, the response section of stories still resembles a regular comment section way too much. A fair perspective, but I was sure there was more to it.

Digging a bit more, I learned of a broader discontent with how discussions occur on the web. The main point being that comments as we know them might bring more harm than good to the stories they support, as they are more often than not a place for spam and trolls, as opposed to a space for in-depth discussion and reflection.

Medium claims to have been born from the desire to enable a digital reading and writing experience that pushes against those hasty tendencies. So I can imagine a mix of vision and priorities shaping the current UI and UX of the platform’s comment section. Maybe what we’re seeing is the intermediate step between regular comments and thoughtful discussions. But again, I digress.

For the time being, since I didn’t want all those “yes, you’re right!” stories displayed on my website, I deleted them. As a matter of fact, they were not adding value to the discussions they were in. And for the future, I’ll keep in mind that if whatever I have to say about a story is worth saying, it deserves to be a story of its own.

2. Updating the website

Updating my website with the fetched data was equally straightforward. As I’ve mentioned, I statically generate my website with the help of a template engine: Pug (it used to be called Jade when I started using it). Pug has a NodeJS API I could use in my build script to generate the HTML files for my website directly. With this API, I just needed to make sure my template would handle the Medium posts, and I would be close to achieving everything I wanted.

On the HTML template side, I added a new section to my home page that would generate a list of my Medium posts, which in Pug syntax terms meant the following:
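A sketch of what that section could look like in Pug, assuming the template receives a posts array with title and link fields (the class names and heading text are illustrative):

```pug
section.medium-posts
  h2 Latest stories
  ul
    each post in posts
      li
        a(href=post.link)= post.title
```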

On the building side, I had to programmatically compile the template with the fetched data using Pug, replacing the step where I saved the JSON file. The result looked like this:

And that was it. To update my website with my latest Medium posts, I just needed to run this simple script — a solution within my simplicity and ease of maintainability constraints. There are some important considerations regarding Medium’s RSS user feeds that I’ll get back to later on, but the important thing is that the second step of my task was done.

3. Updating the website when new posts are published

With the website update script working, the last step was to figure out how to run it automatically. In an ideal world, as soon as I posted something new on Medium, something would trigger my script, making sure it only runs when needed. But given Medium’s API limitations, I was sure they wouldn’t provide any tool to help me with the task.

Some ideas to accomplish this one-post-one-update approach came to mind, but they all required more effort than I was willing to put in. So I decided to go for a scheduled process: running my build script on a regular basis. This is a task for a time-based job scheduler, and I knew GitHub Actions had scheduling capabilities. It seemed natural to pick it for the job, as my codebase and the actual website were already hosted on GitHub. Having everything in the same service is a big plus when you want to keep things simple.

My “Update Website” workflow would have to take the following steps:

  1. Checkout the latest code from the website repository;
  2. Install all node dependencies;
  3. Run the update script;
  4. Commit and push the updated files so that GitHub Pages would update the live website.

A GitHub workflow is basically a list of steps you ask a remote machine to accomplish for you once something happens, instead of doing it yourself. So all I had to do was translate my list of steps into GitHub workflow steps. A good thing about using their service is that you can benefit from community-built actions for most use cases, making the process way more manageable and speedy. My need was no different, and I could achieve what I wanted without much work by composing already-built community actions.

To set up a GitHub workflow, the first thing to do is tell it in which environment you intend to run your job. GitHub provides Ubuntu, Windows, and macOS environments. In my case, I chose Ubuntu. You tell GitHub’s runner which of those you want by adding the runs-on key under your job with the desired value. Notice that every job has a unique id associated with it. In my case, I chose update as the identifier for the job.
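In a workflow file (say, .github/workflows/update-website.yml; the file and workflow names are my choices), that starting point might look like:

```yaml
name: Update Website
jobs:
  update:                  # the job's unique id
    runs-on: ubuntu-latest # the environment the job runs on
```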

Next, you tell it which steps the job is going to take. In my case, the first was checking out the repository code. As I mentioned, most of those routine tasks are available from GitHub’s community. For checking out code from your repository, GitHub itself provides the actions/checkout action. It takes care of downloading your code and making it available on a specific path so that other steps can access it.

Steps are described under the steps key, and to use an external action, you add its id as the value of the uses property. There might be different versions of a given action available. You can target different versions of an action or even point to specific commit SHAs if you need to. Just append @ followed by your target after the action id. In my case, I’m using actions/checkout version 2. Finally, you can also assign a name to each step if you want.
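With the checkout step added, the job sketch becomes:

```yaml
jobs:
  update:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout            # optional human-friendly name
        uses: actions/checkout@v2 # action id @ version
```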

Next, you need to prepare the environment your job will be running in. In my case, I only needed to make sure NodeJS was available. GitHub also provides an action for that, actions/setup-node, so all I needed to do was add it. Some actions allow or even require you to configure specific attributes for them to run correctly. In the Setup Node action’s case, you can set which version spec of Node.js you want available. You pass configurations to an action under the with key. In my case, I set it to run NodeJS version 14, the latest LTS release at the time and also the one I used locally to test my build script, so I knew everything would work fine.
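The corresponding step, as a sketch:

```yaml
      - name: Setup NodeJS
        uses: actions/setup-node@v2
        with:
          node-version: 14 # match the version tested locally
```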

I now had the latest code from the repository available and a working NodeJS environment to run it. Next on the list was installing dependencies and running my build script. Given a working Node environment and my code available, installing dependencies could be achieved by running npm install from the repository’s root path. This would work fine, but some extra configuration would be needed to benefit from npm’s cache. I don’t have that many dependencies on my website project, but on most other projects I work on, I do. Because of that, I got used to using the bahmutov/npm-install action to handle dependencies. It allows you to install npm dependencies leveraging the cache without any extra configuration, which can significantly improve performance.
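Adding the dependency installation step is a one-liner:

```yaml
      - name: Install dependencies
        uses: bahmutov/npm-install@v1 # npm install with caching built in
```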

With my environment ready, I just needed to run my build script and commit the changes to my repository. To run commands directly, you add the command you intend to run as the value of your step’s run property. In my case, I added the code snippet I built to a package.json script, so all I had to do was call it in the following step:
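Assuming the script is registered as “build” in package.json (the script name is my assumption), the step looks like:

```yaml
      - name: Build website
        run: npm run build # runs the package.json script that regenerates index.html
```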

If I publish a Medium post and run this script, a new index.html will be generated. If it runs but no recent posts are found, the newly compiled HTML file and the one already in the repository will have the same content. In the first case, I want to commit the changed index.html file to the repository, while in the latter, there’s nothing left to do. This is the trickiest step in the workflow, since programmatically manipulating git trees isn’t a simple task. Again, the power of a large community comes in handy. After some quick research, I found an action that would allow me to accomplish what I intended: git-auto-commit-action:
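A sketch of that step, using stefanzweifel/git-auto-commit-action, which by default only creates a commit when the working tree actually changed (the commit message is illustrative):

```yaml
      - name: Commit updated files
        uses: stefanzweifel/git-auto-commit-action@v4
        with:
          commit_message: "chore: update Medium posts"
```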

With this, my job was ready. All that was left was to tell GitHub when to run it. To describe what triggers a workflow, you set values for the special on workflow attribute. You can start a workflow after many kinds of events, such as new pull requests, commits, issues, and so forth. For my use case, I wanted the workflow to trigger regularly. To do that, I used the schedule event. You define the schedule using cron syntax. Because I can never remember the syntax, I always go to crontab.guru to build the correct entry. This left me with the following:
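The resulting trigger, set to run once a day:

```yaml
on:
  schedule:
    - cron: '0 0 * * *' # every day at 00:00 UTC
```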

Every day at 0:00 UTC, GitHub will run the workflow that updates my website with my latest Medium posts. As a final touch, I added another trigger to allow me to run the workflow manually. The event is called workflow_dispatch. The final workflow:
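Putting the pieces together, a complete sketch of the workflow (the workflow name, script name, and commit message are my assumptions):

```yaml
name: Update Website
on:
  schedule:
    - cron: '0 0 * * *' # every day at 00:00 UTC
  workflow_dispatch:     # allow manual runs from the Actions tab
jobs:
  update:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v2
      - name: Setup NodeJS
        uses: actions/setup-node@v2
        with:
          node-version: 14
      - name: Install dependencies
        uses: bahmutov/npm-install@v1
      - name: Build website
        run: npm run build
      - name: Commit updated files
        uses: stefanzweifel/git-auto-commit-action@v4
        with:
          commit_message: "chore: update Medium posts"
```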

And with it, my task was done. I was already familiar with GitHub Actions, so setting up this workflow wasn’t that much work. From my perspective, all solution requirements were met. I definitely plan to improve this later, especially on the styling side of things, but overall I’m happy with the result.

Conclusion

My solution turned out simple enough while still holding a degree of sophistication, made possible especially by GitHub Actions. While working on it, I ended up learning a bit more about Medium on several levels, and I have a couple of topics I’d like to dig into later as a result. I could feel the challenges posed by choosing a platform I do not entirely control to share my work. However, I’m still confident that with creativity and the right set of tools, one can build their own space on the web while leveraging others’ platforms to help with the task.

Learning to code, to think, to fast, and to wait — writing thoughts https://vicnicius.com
