I've never had a blue checkmark on Twitter, so I can't say what that process is like. From the outside, though, it doesn't seem simple or straightforward. My experience with verification has been with other applications such as Mastodon and IndieAuth. In both cases, the main verification techniques use your domain and/or the rel=me link type. Although Mastodon lets you add multiple verified links to your profile, verification on those external domains poses challenges, which include but aren't limited to:

  • You can't edit the HTML directly so you can't add a rel=me link.
  • You can only add one link, so you're forced to choose between your website and Mastodon.
  • In scenarios where multiple links (e.g., email, webpage, Twitter) are supported, Mastodon isn't one of the options.

So how do you overcome these challenges? By using your domain.

A few months ago I wrote about owning your links. The general concept is using your domain to redirect to other properties and profiles you own. For example, instead of telling people to visit my Twitter profile at twitter.com/ljquintanilla, I can point them to lqdev.me/twitter. Not only does my domain become the source of truth, but it's also generally easier to remember than usernames and proprietary domains. Since my site is statically generated, I own my links by using http-equiv meta tags. Although the only job of these pages is to redirect to the specified site, they're still HTML pages. That means I can embed any other HTML I want, including rel=me links. By including the rel=me link in the redirect page, you satisfy the requirements of applications like Mastodon, because that's the page they land on before being redirected to the target site. As a result, I can verify my GitHub, Twitter, or any other online property with Mastodon using my own domain, without having to make any changes on those platforms or overcome any of the challenges they impose. What you end up with is something like this:

Screenshot of lqdev profile verified links
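A redirect page along these lines can be as simple as the following (a minimal sketch, not my exact page; swap in your own target and Mastodon profile URLs):

```html
<!DOCTYPE html>
<html lang="en">
  <head>
    <meta charset="utf-8" />
    <!-- Redirect immediately to the target profile -->
    <meta http-equiv="refresh" content="0; url=https://twitter.com/ljquintanilla" />
    <!-- rel=me link back to the Mastodon profile; this is what Mastodon checks -->
    <link rel="me" href="https://toot.lqdev.tech/@lqdev" />
  </head>
  <body>
    <p>Redirecting to <a href="https://twitter.com/ljquintanilla">Twitter</a>...</p>
  </body>
</html>
```

When Mastodon verifies a profile link like lqdev.me/twitter, it fetches this page, finds the rel=me link pointing back at the Mastodon profile, and marks the link verified.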

Before signing off, I'll note that this isn't a perfect solution and technically anyone could do the same thing on their own domain in an attempt to impersonate you, so just be aware of that.


Soon after .NET 7 was released, I upgraded the static site generator I use as well as the GitHub Actions that build and publish my website. Having upgraded to .NET 6 last year, I expected the process to be smooth, and it was: no code refactoring required.

When it comes to development environments, for quick status updates like the ones on my feed or minor edits, I've been using github.dev. However, there have been times when I've needed to run and debug code to confirm that my changes work. This is where I hit some of the limitations of github.dev, which means that unless I set up a Codespace, I have to save my work and move offline to my PC. Codespaces are great, but given that they're a nice-to-have rather than a requirement for my workflow at this time, it didn't make sense for me to pay for them. That's why I was excited to learn that GitHub is providing up to 60 hours per month of free Codespaces usage to all developers. That's more than enough for me.

Blog post authored in GitHub Codespaces with integrated terminal open

By default, Codespaces images come preinstalled with the .NET 6 SDK, which makes sense considering it's the latest LTS. However, since my static site generator targets .NET 7, I had to configure my Codespace to use it. This was just as easy as upgrading to .NET 7: all I had to do was reference the .NET 7 SDK Docker image in my devcontainer.json configuration file. From there, Codespaces takes care of the rest. As a result, I can now run and debug my code all in one place without interrupting my workflow.
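A minimal devcontainer.json along those lines looks something like this (the image tag is the standard .NET 7 SDK image on the Microsoft container registry):

```json
{
  "name": ".NET 7",
  "image": "mcr.microsoft.com/dotnet/sdk:7.0"
}
```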

PS: This post was authored in GitHub Codespaces πŸ™‚


It's been great to see many old and new friends joining Mastodon and considering alternative online communities over the past few weeks. Selfishly, my feed had been quiet since I started using it and self-hosting back in 2018, so it's nice to see more activity. As folks settle in and get a better understanding of Mastodon, I'd like to point out that Mastodon is only one of many apps and services on the Fediverse. I won't try to explain the Fediverse here, but in general it's the collection of applications and services that interoperate in a federated manner, Mastodon being one of them. If you like Mastodon and are looking for alternatives to services like Instagram, YouTube, or SoundCloud, the Fediverse has a few options:

You can create an account for each of these services using different instances, just like Mastodon. You also have the option of self-hosting, which I find amazing. Because most of these services implement the ActivityPub protocol, federation extends beyond Mastodon: you can follow people on PixelFed, PeerTube, and many of these other services using your Mastodon account, which means you don't have to create accounts on those services just to follow people. I haven't tried all of them, but I recommend checking them out. Pixelfed, PeerTube, and Owncast are the ones I have experience with, and I especially like what they're doing.


Last year, I posted about upgrading this website to .NET 6 and how easy that process was. I just had a chance to upgrade to .NET 7 and the experience was just as smooth. All I had to do was change the version in two places:

GitHub Actions *.yml file

dotnet-version: '7.0.x'

*.fsproj file
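The fsproj side is presumably just bumping the target framework moniker, along these lines:

```xml
<PropertyGroup>
  <!-- Bumped from net6.0 to target .NET 7 -->
  <TargetFramework>net7.0</TargetFramework>
</PropertyGroup>
```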


I'm happy that with such a simple change, I can get all the benefits of .NET 7 without doing a lot of work.


Slow start but good bounce back by Penn State after two straight losses. This game also gave a good glimpse of the future class and it looks promising especially at the QB and RB spots. If they can stay healthy, they may be playoff contenders in a couple of years.



Hoping for a good game.

Michael Scott ready to get hurt again GIF


Defense is clutch. As always in big games, they'll have to work overtime just to keep the offense in the game.


Progress...at least there was no turnover on the last series.


Undertaker rising from the dead GIF


Football player they had us in first half GIF


Sweating GIF


Going for it on 4th & 2 feels like that 4th & 5 from a few years ago.


Defense showed up today. Can't expect to win a game with 3 turnovers.

Spongebob head out GIF


This is hard to watch.

DJ Khaled Another One GIF


I can probably count on one hand the number of times I've used the Azure Cloud Shell inside the Azure Portal. However, I recently had a need for it and was impressed by the experience.

I wanted to create an Azure Functions app but didn't want to install all the required tools. Currently I'm running Manjaro on an ASUS L210MA. That's my daily driver, and although the specs are incredibly low, I like the portability. The Azure tools run on Linux but are easier to install on Debian-based distributions. The low specs plus the Arch-based distro didn't make me want to go through the process of installing the Azure tools locally for a throwaway app.

That's when I thought about using Azure Cloud Shell. Setup was relatively quick and easy. It was also great to see the latest version of .NET, Azure Functions Core Tools, and Azure CLI were already installed. Even Git was already there. After creating the function app, I was able to make some light edits using the built-in Monaco editor. When I was ready to test the app, I was able to open up a port, create a proxy, and test the function locally. It's not a high-end machine so I won't be training machine learning models on it anytime soon. However, for experimentation and working with my Azure account it's more than I need.


The last I'd heard of Tumblr, it was owned by Verizon. While reading the book Indie Microblogging by Manton Reece, I learned (and later confirmed) that it's now owned by Automattic, the company behind WordPress. Of the different homes it's had, I can't think of a better one for it.


Terrifier 2 did not disappoint. 100/10. Can't wait for Terrifier 3.

Leo DiCaprio clapping


Finally got a working prototype. A few days ago I added the request and webmention verification components.

I just finished creating the web API and Azure Table Storage components. Now I just need to run some additional tests and deploy it πŸ™‚

Chris Pratt Rubbing Hands GIF


It's been a year since I created this feed and started posting mainly on my website. Since then, I've:

  • Used a shorter domain to redirect to my website domain.
  • Created VS Code snippets to simplify metadata tagging.
  • Used github.dev as the main interface for authoring and editing posts. For longer posts, I use VS Code locally but for these smaller microblog-style posts, github.dev makes it really easy.
  • Syndicated posts to Twitter and Mastodon.
  • Created a response feed for interactions like replies, reshares (repost), and likes (favorites) with support for sending Webmentions.
  • Created RSS feeds for each of my feeds.

It hasn't happened overnight. Instead, it's been the result of small incremental efforts over time.

Overall, I don't think I've posted any more or less than I do on other platforms.

What I have enjoyed the most though has been:

  • Learning.
  • Building the tools and processes to author and share content.
  • Owning my content. My website acts as the single source of truth.
  • Not requiring others to have or create accounts on individual platforms to access content.
  • Choosing how I author content. I'm sure this post is way over the 280 character limit and if there's a typo, I can just edit the post and republish the website πŸ™‚

Going forward, I plan to:

  • Update RSS feeds to include post content, not just link to the post.
  • Accept Webmentions. I'm about halfway done with the main parts of my implementation.
  • Implement tags for easier discoverability / search.
  • Consolidate metadata for posts (articles, microblogs, responses).

Success! I just partially implemented Webmentions for my website. Although I haven't figured out a good way to receive Webmentions yet, I'm able to send them. Fortunately, most of the work was already done, as detailed in the post Sending Webmentions with F#. The rest was mainly a matter of adapting it to my static site generator.

Below is an example of a post on my website being displayed in the Webmention test suite website.


Webmention on lqdev.me


Webmention displayed in webmention test suite

What does this mean? It means I can comment on any website I want, regardless of whether they allow comments or not. As if that weren't enough, I have full ownership of my content as my website is the single source of truth. As a bonus, if the website I comment on supports receiving Webmentions, my post will be displayed on their website / articles as a comment. The next step is to handle deleted comments, but so far I'm happy with the progress.
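My implementation is in F# as described above, but the first step of sending a Webmention, discovering the target's endpoint, can be sketched in Python using only the standard library (a simplified sketch: the spec also checks the HTTP Link header before the page body, which this skips):

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class WebmentionEndpointFinder(HTMLParser):
    """Finds the first <link> or <a> tag advertising rel="webmention"."""
    def __init__(self):
        super().__init__()
        self.endpoint = None

    def handle_starttag(self, tag, attrs):
        if self.endpoint is not None or tag not in ("link", "a"):
            return
        attrs = dict(attrs)
        rels = (attrs.get("rel") or "").split()
        if "webmention" in rels and "href" in attrs:
            self.endpoint = attrs["href"]

def discover_endpoint(html, page_url):
    """Return the absolute webmention endpoint URL advertised by a page."""
    finder = WebmentionEndpointFinder()
    finder.feed(html)
    if finder.endpoint is None:
        return None
    # Endpoints may be relative, so resolve against the page URL
    return urljoin(page_url, finder.endpoint)

page = '<html><head><link rel="webmention" href="/webmention"></head></html>'
print(discover_endpoint(page, "https://example.com/post"))
# -> https://example.com/webmention
```

Once you have the endpoint, sending the mention is a form-encoded POST with `source` (your post) and `target` (the page you're responding to).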


Very cool!

New (LTE) Rotary Phone


Same... πŸ™

Praying on Ohio State's downfall

β€” Barstool Penn State (@PSUBarstool) September 3, 2022

Source: https://twitter.com/PSUBarstool/status/1566230451378950144


Today I Learned about the VS Code Live Preview extension. While reading the release notes for the latest version of VS Code, I noticed a note on the Live Preview extension. Usually to preview static content, I use Python's HTTP server with the python -m http.server command. This extension simplifies the process because it's built into Visual Studio Code. If you do any kind of web development, it's definitely worth a look.


College football is definitely back. I don't miss those 4th quarter heart attacks, but a win is a win.


Very cool! QRZ just announced the New Ham Jumpstart Program, a new program to make it easier for new ham radio operators to quickly get on the air.


Things are heating up in the AI generative model space. OpenAI just announced Outpainting. A few days ago, Stability.ai released Stable Diffusion and recently a painting generated using Midjourney won first place in the digital category of the Colorado State Fair fine art competition.


I know what I'm doing October 6th.

Terrifier 2


Can't believe I hadn't heard about these podcasts before:

Love the horror anthology vibes like Tales From The Crypt and Freakshow for Radio Rental. For Dark Air, I really like the Space Ghost Coast to Coast and actual Coast to Coast feel. Best of all, the host Terry Carnation is Rainn Wilson. Even Terry's website has a great 90's web feel to it. Looks like Dark Air has been off the air for some time, but hopefully Radio Rental keeps publishing content.


Apple just announced they're discontinuing the iPod Touch. I'm surprised it didn't happen earlier! Still, it's sad to see the end of an era for these devices, which are effectively smartphones without the phone / cellular data connectivity. I still have my iPod Touch from 2014 that I use occasionally for FaceTime; aside from poor battery life and being stuck on an older version of iOS, it still works smoothly.

For the past two years, my audio solution has been a dedicated MP3 device. Since most of the content I consume I download and listen to offline, a dedicated MP3 device doesn't drain my phone's battery or take up storage space.

I started with the FiiO M5, which is like an iPod Shuffle. The device was great, but all it took was one drop for the screen to crack, at which point I upgraded to the FiiO M6. The nice thing about the M6 is that the OS is a stripped-down version of Android, meaning you can side-load applications such as Spotify. The bad thing is that the hardware can't handle even heavily optimized versions of these applications, making the user experience frustrating. The touchscreen interface is great for navigation, especially when you have large collections. However, since I only use the device for podcasts and audiobooks and don't benefit from installing streaming music apps, I'd be better off with a more affordable device like those from SanDisk.


It's been a little over a year since I started hosting my website on Azure. The resources I use include:

  • Azure Storage: Where all my site assets are stored.
  • Azure CDN: Content delivery.
  • Azure Logic Apps: Event-based jobs to syndicate new posts on Twitter and Mastodon.

I haven't tracked traffic for the entire year but since October of last year the site has gotten roughly 5,000 visitors.

Taking all that into account, I've only had to pay $2.06, most of it for the storage the site uses. In comparison, I was previously hosting my website through my domain provider on a shared server, where it cost about $60!

Azure portal billing cost

If you're interested in how this site is built, check out the colophon.


Twitter is testing a new feature called Twitter Circles where you can restrict the reach of your tweets to up to 150 people. I remember the days of MySpace where many relationships were made / broken as a result of being in someone's Top 8 Friends list πŸ˜†


Hello Mastodon from lqdev.me. Got POSSE (Publish (on your) Own Site, Syndicate Elsewhere) working for Mastodon! I already had an Azure Logic App which took the latest post from my RSS feed and published to Twitter. Now it's also enabled for Mastodon. Might do a writeup later in the blog on how to do it.


Yesterday I ran into more feed goodness. When I visit a website to get the weather forecast, weather alerts are displayed (if there are any).

I thought, wouldn't it be great to be able to subscribe to those alerts without having to go to the website?

A quick search revealed the National Weather Service provides these alerts in ATOM format for each of the states in the U.S. You have the option of tracking updates at the national, state, zone, and county level.

Subscribing is as easy as adding the link for the respective feed to your favorite feed reader. If you're interested, you can find the list of feeds at https://alerts.weather.gov/
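Your feed reader handles the parsing, but since the alerts are plain ATOM, scripting against them is easy too. A Python sketch (standard library only) run against a trimmed-down sample in the shape of those feeds:

```python
import xml.etree.ElementTree as ET

ATOM_NS = {"atom": "http://www.w3.org/2005/Atom"}

def alert_titles(atom_xml):
    """Extract the entry titles from an ATOM alert feed."""
    root = ET.fromstring(atom_xml)
    return [entry.findtext("atom:title", namespaces=ATOM_NS)
            for entry in root.findall("atom:entry", ATOM_NS)]

# Trimmed-down sample; real feeds come from alerts.weather.gov
sample = """<?xml version="1.0" encoding="UTF-8"?>
<feed xmlns="http://www.w3.org/2005/Atom">
  <title>Current Watches, Warnings and Advisories</title>
  <entry><title>Winter Storm Warning</title></entry>
  <entry><title>Coastal Flood Advisory</title></entry>
</feed>"""

print(alert_titles(sample))
# -> ['Winter Storm Warning', 'Coastal Flood Advisory']
```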


In light of recent developments on Twitter, I see more people looking for alternative platforms. Personally, I don't think anything beats the level of customization and control you get from using your own website and RSS for sharing ideas and publishing content. However, it's great to see so many new friends in the Fediverse on platforms like Mastodon. I won't go into what the Fediverse is but you can check out the Fediverse wiki for more info. In many ways the Fediverse will feel familiar, yet foreign. Many people have shared tips on getting started, but I'll share some of the ones I've personally used to organically curate my feed.

  • Use hashtags. Similar to Twitter, hashtags are a good way of finding people and content related to the topic of interest.
  • Follow feditips and FediFollows. As the names suggest, they provide good tips and suggest interesting accounts to follow in the Fediverse.

As an alternative, if you're not ready to create an account, you can use RSS. I'm a big fan of RSS. Appending .rss to an account's profile URL on platforms like Mastodon (for example, https://mastodon.online/@FediFollows.rss) gives you the RSS feed of that account's posts, letting you subscribe to their content through your favorite RSS feed reader.
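The URL shape is regular enough that building it from a full handle is a one-liner. A tiny Python helper, purely for illustration:

```python
def mastodon_rss_url(handle):
    """Turn a user@instance (or @user@instance) handle into the RSS feed URL."""
    user, instance = handle.lstrip("@").split("@")
    return f"https://{instance}/@{user}.rss"

print(mastodon_rss_url("@FediFollows@mastodon.online"))
# -> https://mastodon.online/@FediFollows.rss
```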

Feel free to say hi at toot.lqdev.tech/@lqdev. See you in the Fediverse!


Here are a few links I found on the interwebs today that I thought were worth sharing.

Ubuntu 22.04 LTS Released

For my personal computing needs, Manjaro has been my default Linux distro. However, for hosting and cloud VMs I go back and forth between Debian and Ubuntu depending on which better supports the software I'm running. Canonical announced the release of the latest Ubuntu LTS version 22.04. Check out their blog post for more details on what's new.

Self-hosting Bitwarden on DigitalOcean

When it comes to password managers, other than KeePass, my favorite is Bitwarden. It's open-source, built on .NET, and has excellent features that get even better with the low-cost paid version, but most importantly it gives you the option of self-hosting. If you're looking to self-host Bitwarden on DigitalOcean, make sure to check out the guide they just published.

How DALLE-2 Actually Works

A few weeks ago, OpenAI announced the release of DALLE-2. OpenAI describes DALLE-2 as "...a new AI system that can create realistic images and art from a description in natural language." If you want to understand the details of how it works, you could read the 27-page research paper Hierarchical Text-Conditional Image Generation with CLIP Latents. AssemblyAI just published an approachable guide to the inner workings of DALLE-2. If you're interested, check out their blog post on the topic.

Emacs Configuration Generator

In my early days of using Emacs, I often found it difficult to customize it to my needs. Not only did I not know what was possible, but it also caused some anxiety because I didn't feel confident enough making major tweaks. If you're getting started with Emacs and aren't sure how to configure it, this neat utility helps you generate a config file for Emacs.


If you're building machine learning applications, one of the main things you'll need is data. This data can reside in databases such as MySQL, SQL Server, and Postgres. Algorithms are applied to this data downstream using libraries written in languages like Python, R, and .NET. This not only means potentially moving the data from the database to another process but also using another library to train models. According to the project description on GitHub, "PostgresML is an end-to-end machine learning system. It enables you to train models and make online predictions using only SQL, without your data ever leaving your favorite database." If that's something you're interested in, check out the project on GitHub.


About two episodes into the show Archive 81 on Netflix and so far so good. A few years ago, I listened to a few episodes of the podcast (of the same name) it's based on. Now it makes me want to go back and listen to it again. It also makes me think about what The Black Tapes could've been had it gotten a TV or movie adaptation. If you're into found footage, cult-horror type stories (think Blair Witch + Midsommar + Rosemary's Baby), definitely check it out.


Vertigo is one of my all time favorite movies, so I was excited to learn about Vertigo AI. Vertigo AI is an experiment that trained a model on footage from the film Vertigo to produce its own original short film. There aren't many details on the AI aspects of it, but I suspect there's a Generative Adversarial Network (GAN) similar to Norman behind it. Check out the generated movie.

Vertigo AI


That was easy. Site's been updated to .NET 6 and all it took was a one-line change.

GitHub PR diff .NET 6 update


I got a reminder of the importance of backups last night. My new Jami username is lqdev1. Fortunately I wasn't very active on there, so losing my original username isn't a big deal.


I recently came across the Terms of Service; Didn't Read (TOSDR) website. Like most people, I'm guilty of clicking "I have read and agree to the terms of service" without actually doing so. TOSDR simplifies terms of service documents by highlighting the main points for many popular services. I'm not sure how the summaries are maintained or how often, and there's also a rating system whose workings I'm equally unsure of. However, you are provided with the documents that were used to come up with the ratings and summaries, so you can look into it yourself. Overall, I think it's an incredibly useful tool for making sense of how your data is managed across these platforms.

Duck Duck Go and YouTube TOSDR service comparisons


Found these while cleaning out some drawers 😒. Good times.

Two Nokia Lumia Windows Phones


I've written before about how versatile the RSS protocol is for publishing and consuming content. Podcasts are one of the many media types that largely rely on RSS, making it easy to subscribe and manage using an RSS client of your choice. Like RSS, VLC is versatile in the type of media it can play. When you combine the two, you're able to stay updated on the latest episodes from your favorite podcasts using the RSS reader of your choice and use the episode's URL to stream the audio in VLC.
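Podcast feeds put each episode's audio URL in an enclosure tag, so pulling out the URLs to hand to VLC is straightforward. A Python sketch against a trimmed-down sample feed (the episode data below is made up):

```python
import xml.etree.ElementTree as ET

def episode_urls(rss_xml):
    """Collect (title, audio URL) pairs from a podcast RSS feed's enclosures."""
    root = ET.fromstring(rss_xml)
    episodes = []
    for item in root.iter("item"):
        title = item.findtext("title")
        enclosure = item.find("enclosure")
        if enclosure is not None:
            episodes.append((title, enclosure.get("url")))
    return episodes

# Trimmed-down sample in the usual podcast RSS shape
sample = """<rss version="2.0"><channel>
  <item>
    <title>Episode 1800</title>
    <enclosure url="https://example.com/1800.mp3" type="audio/mpeg" length="0"/>
  </item>
</channel></rss>"""

print(episode_urls(sample))
# -> [('Episode 1800', 'https://example.com/1800.mp3')]
```

From there, paste the URL into VLC's Open Network Stream dialog and it plays the episode directly.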

.NET Rocks podcast RSS feed and VLC playing


Lately I've been using github.dev to author posts and make light edits on this site. One use case where it's come in extremely handy has been authoring content, specifically these microblog-like posts, from my Surface Duo. Instead of opening the laptop, I can open up my Surface Duo and visit my site's repo through github.dev. When I orient the device in landscape mode, one screen displays the VS Code-like web environment while the other displays the keyboard. Snippets, version control, extensions, and themes all carry over, making the experience consistent. Committing changes through the version control utilities immediately kicks off the GitHub Action to build and publish the site with the latest changes. Pairing the GitHub online editor environment with the Duo's dual-screen form factor showcases the productivity and convenience both of these products provide.

Microblogging on Surface Duo in github.dev


I've been looking at different alternatives to Google Analytics for website stats. I don't need anything fancy, just basic page counts. Eventually I settled on GoatCounter because it's lightweight, open-source, and privacy-friendly. You can either use their managed service or self-host. Deployment with Docker felt a bit much so I did a standalone deployment to a VM. The Linode StackScript is probably the easiest way to do it.


While browsing the interwebs, I ran into the site United States Early Radio History, which covers events spanning different time periods. As an operator, I find it fascinating to learn about the history and see how radio continues to play a large role in our day-to-day lives.


While browsing the interwebs, I came across the Web Neural Network API spec from the W3C Web Machine Learning Working Group. The abstract defines it as "a dedicated low-level API for neural network inference hardware acceleration." Although there are already a few frameworks like ONNX & TensorFlow.js that let you run inference in the browser, this spec looks interesting because it provides an abstraction that lets you take advantage of hardware acceleration using the framework of your choice. As the explainer document mentions, "this architecture allows JavaScript frameworks to tap into cutting-edge machine learning innovations in the operating system and the hardware platform underneath it without being tied to platform-specific capabilities, bridging the gap between software and hardware through a hardware-agnostic abstraction layer." It's just a working draft for now, but I'm looking forward to seeing how this develops.


The other day I was going down a Wikipedia rabbit hole and learned those click-baity "Top 10 ..." or "Worst 10 ..." articles apparently have a name, listicles. The more you know...


That was easier than I thought. Visiting lqdev.me now redirects to this site. The redirect isn't limited to the main page either; you can access any other resource as well. For example, visiting lqdev.me/feed redirects you to the main feed on this site.


Today I learned about canonical URLs. While looking into how I can syndicate content from this site, specifically longer form blog posts on other sites like dev.to, I came across canonical tags.

Basically, they're a way of telling the internet, specifically search engines, which version of your content is the main copy or single source of truth.
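In practice, a canonical tag is a single link element in a page's head. On a syndicated copy, it points back to the original (the URL below is just a placeholder):

```html
<head>
  <!-- Tells search engines the original version lives at this (placeholder) URL -->
  <link rel="canonical" href="https://www.lqdev.me/posts/my-post" />
</head>
```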

Currently I've configured my site so it's not indexed or crawled. However, for sites I don't own I don't have the same level of control. Check out the canonicalization and SEO best practices for canonical URLs articles from Moz if you're interested in learning more.


I have almost zero artistic skills, so when I came across NVIDIA Canvas I was extremely excited!

Essentially it's an application that takes simple brushstrokes as input and uses a Generative Adversarial Network (GAN) to turn them into realistic landscape photos.

I had known about GANs being used to generate faces (see This Person Does Not Exist) but it's one of the first times I've seen them used to enhance drawings.

Time to take my stick figures to the next level!


Came across this event series via Chris Aldrich's microblog.

Here's Chris' original post

Every Monday, I sit down with a cup of coffee and learn something interesting about history through old manuscripts. Join our friends Dot Porter (@leoba) and the Schoenberg Institute for Manuscript Studies (@sims_mss) at Coffee with a Codex.

Basically it's a virtual meeting where experts go over manuscripts on a variety of topics. I'll probably check out next week's event.


It only took 6 hours but I was able to get it done with a little help.

Bender from Futurama happy dance


No Windows 11 upgrade for me. πŸ˜₯

PC doesn't meet windows 11 upgrade requirements message

Just because I can't doesn't mean you can't, though. Go check if you're eligible via Windows Update:

Start > Settings > Update & Security > Windows Update.

See Manage updates in Windows for more details.

For alternative install options, check out the Download Windows 11 page.


Just jumping on the bandwagon. A few online services are down at the moment 😬

Web service outages on downdetector.com

Not to worry...the personal site is still chugging along 😌


Just read an interesting article published in Technology Review called Digital gardens let you cultivate your own little bit of the internet.

In a sense, some of the updates to this site like the addition of the different feeds can be thought of as "cultivating" my own digital garden.


I would've been okay with The Many Saints of Newark being a 4 hour movie. I didn't want it to end. Now I have to rewatch The Sopranos.


For personal use, I tend to default to Skype for my video / audio call needs.

Great to see there are some long-term plans for the app and it'll be around for a while more.

The future of Skype: Fast, playful, delightful and buttery smooth


Found this interesting video by Chakram & Riley Thompson. My favorite part starts around the 7:15 mark.

When Planets Mate Video by Chakram & Riley Thompson


Hello World! First post in the personal feed.