To the best of my memory, Tweetbot has been my window into Twitter since pretty close to when it was released. I open Twitter’s web interface from time to time when I have to, but it has not been my primary way to use the service for well over a decade. There are multiple reasons I was fine with living as a second-class Twitter citizen: primarily no ads, no algorithmic timeline, and syncing between mobile and desktop clients.
State of CSS 2022
A great overview of all the latest and upcoming CSS features. Presented by Adam Argyle at Google I/O ‘22.
Connecting to the PassKit API with Ruby
We were recently testing PassKit as a way of managing membership cards for giving societies. PassKit is very up-front that they are not a CRM, and they strongly suggest using their API for integrating with outside systems and for editing pretty much any data. To kick the tires, we set up some very basic scripts to connect to the PassKit API.
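A minimal sketch of what one of those scripts might look like. The host, endpoint path, and bearer-token auth here are assumptions for illustration, not PassKit’s actual API surface; check their docs for the real paths and signing scheme.

```ruby
require "net/http"
require "uri"

# Hypothetical PassKit API host -- a placeholder, not the real endpoint.
PASSKIT_BASE = "https://api.example.passkit.io"

# Build (but do not send) a GET request for a single member record.
# The path and Authorization header are assumptions for the sketch.
def build_member_request(member_id, token)
  uri = URI("#{PASSKIT_BASE}/members/member/#{member_id}")
  req = Net::HTTP::Get.new(uri)
  req["Authorization"] = "Bearer #{token}" # hypothetical auth scheme
  req["Accept"] = "application/json"
  req
end

# Sending it would look something like:
#   req = build_member_request("abc123", ENV["PASSKIT_TOKEN"])
#   uri = URI(PASSKIT_BASE)
#   res = Net::HTTP.start(uri.hostname, uri.port, use_ssl: true) do |http|
#     http.request(req)
#   end
```

Keeping the request-building separate from the sending made it easy to poke at things in `irb` while we kicked the tires.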
My Challenge to the Web Performance Community
Philip Walton on the difficulties the webperf community faces when discussing web performance. Simple numbers don’t cut it. We need to provide context when discussing performance results.
What concerns me about this practice is that it glosses over a lot of important nuance, and it perpetuates the idea that synthetic or lab-based tools (like Lighthouse, WebPageTest, and many others) are genuine and precise assessments of a site’s actual, real-world performance—rather than what they are: tools to test, debug, diagnose, optimize, and predict performance or detect regressions under a set of controlled conditions.
I’m definitely guilty of the simplicity he discusses. Thanks for the challenge, Philip.
RSS and XSL Content-Type Requirement
We’ve provided Atom/RSS feeds for News and Events in our custom CMS at Notre Dame for well over a decade. However, if a visitor landed directly on the URL, they were greeted with an unhelpful screen of raw XML. I decided to remedy this by applying some XSLT and styles to improve the user experience. Yet even after reviewing several tutorials, I couldn’t get it to work.
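For reference, attaching a stylesheet to a feed looks roughly like this (the `href` is a placeholder). As the post title hints, the Content-Type the server sends matters: browsers generally only honor the `xml-stylesheet` processing instruction when the response is served as an XML type such as `text/xml` or `application/xml`, not when it triggers feed-specific handling.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="/feed.xsl"?>
<rss version="2.0">
  <channel>
    <title>News</title>
    <!-- items... -->
  </channel>
</rss>
```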
Higher ed's slow page speed epidemic
Joel Goodman of Bravery Media on the current state of higher ed homepages.
Regardless, it’s an agency’s responsibility to do as much as possible to make that website a success when it goes live. Do no harm. Slow websites only do harm. Code needs to be optimized, frameworks need to be ditched, images need to be properly sized and deferred, CSS and JavaScript need to be used with efficiency in mind.
Indeed
The 2020 Web Almanac
The HTTP Archive released the 2020 version of their Web Almanac based on data from 7.5 million websites.
Our mission is to combine the raw stats and trends of the HTTP Archive with the expertise of the web community. The Web Almanac is a comprehensive report on the state of the web, backed by real data and trusted web experts. It is comprised of 22 chapters spanning aspects of page content, user experience, publishing, and distribution.
If you’re into web performance, they have chapters on both Performance and Page Weight.
HighEdWeb 2020
My presentation for HighEdWeb 2020 was about the basics of website performance evaluation. We covered tools such as Lighthouse and WebPageTest, and we also updated an example site to improve its performance.
Thirteen Years
I’ve had five jobs since university. Of the previous four, my longest stint was four years. Today marks thirteen at Notre Dame. That’s the great thing about working for a university. After thirteen years, I’m still not bored, and there’s still so much to do.
Bad Assumptions
Web creators make some common assumptions, and I’m guilty of them myself. One is that if a visitor is on a large screen, they must have a pretty decent connection, and as a result they’re sent large images and (often) auto-playing background videos.
This is a very bad assumption.
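One way to stop baking that assumption into the markup is to let the browser pick an appropriately sized image with `srcset` and `sizes`. The filenames and breakpoints below are placeholders; the browser chooses a candidate based on layout width and device pixel ratio, and some browsers also factor in data-saver settings.

```html
<img
  src="hero-800.jpg"
  srcset="hero-480.jpg 480w, hero-800.jpg 800w, hero-1600.jpg 1600w"
  sizes="(min-width: 60em) 50vw, 100vw"
  alt="Campus quad at sunset"
  loading="lazy">
```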