To clean up local Git branches that no longer exist on the upstream repository, you can run these commands.
To start, make sure your working copy is up to date.
git fetch --prune
The --prune option removes any remote-tracking references that no longer exist on the remote repository.
Running git branch lists all the branches in your working copy. Adding the --merged flag filters that list to show only those branches that have already been merged into the main or master branch.
git branch --merged
To delete all the merged branches from your working copy, run this command.
git branch --merged | egrep -v "(^\*|master|main)" | xargs git branch -d
The list of merged branches is piped to an egrep command that eliminates the master or main branch, as we don’t want to delete those. The remaining branch names are piped to the git branch -d command, which deletes them.
Whenever I find a command or set of commands online that purport to accomplish some task, I always break the command down and execute each stage of it, to make sure I understand what it is doing and to ensure that it does what the author claims. Trust, but verify.
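In that spirit, the filtering stage of the pipeline above can be exercised on its own with sample input before trusting it with real branches; the branch names below are invented for illustration.

```shell
# Feed the egrep stage some fabricated "git branch" output to see
# exactly which lines survive the filter.
printf '* main\n  feature/login\n  master\n  bugfix/typo\n' \
  | egrep -v "(^\*|master|main)"
# prints:
#   feature/login
#   bugfix/typo

# To dry-run the whole pipeline in a real repository, neuter the
# destructive stage by inserting echo before it:
#   git branch --merged | egrep -v "(^\*|master|main)" | xargs echo git branch -d
```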
MacOS has a number of network interfaces, making the output from ifconfig messy and not easily parsed visually.
A non-exhaustive list of network interfaces you might see on MacOS includes:

ap1 – Access Point. Used if you are using your Mac as a wireless access point to share its connection.
awdl0 – Apple Wireless Direct Link. WiFi P2P link for AirDrop, AirPlay, etc. Also used for Bluetooth.
llw0 – Low latency WLAN interface. Used by the Skywalk system.
utun# – Tunneling interface. Used for VPN connections. MacOS seemingly hangs on to multiple utun interfaces, even after they aren’t in use.
lo0 – Loopback or localhost
gif0 – Software Network Interface
stf0 – 6to4 tunnel interface
en0 – Ethernet interface
en1 – Wireless interface
en5 – iBridge / Apple T2 Controller
en6 – Bluetooth PAN
en8 – iPhone USB
en9 – VM network interface
en10 – iPad
bridge0 – Thunderbolt Bridge

A simple way to see the current list of IP addresses your Mac has is by using an alias like this.
alias inet='ifconfig | grep inet | grep -v inet6'
Which produces output similar to this.
❯ inet
inet 127.0.0.1 netmask 0xff000000
inet 192.168.6.39 netmask 0xfffffc00 broadcast 192.168.7.255
inet 192.168.65.1 netmask 0xffffff00 broadcast 192.168.65.255
inet 100.101.18.109 --> 100.101.18.109 netmask 0xffffffff
While that can be useful, it would be nicer to know which interface had which IP address.
This command will display the name of the network interface and the assigned IP address for the active network interfaces.
ip -4 addr show | awk '/inet/ {print $NF, $2}' | column -t
In order for this command to work, you need to install the iproute2mac formula via Homebrew.
The ip -4 addr show command displays all the network interfaces that have an IPv4 address.
❯ ip -4 addr show
lo0: flags=8049<UP,LOOPBACK,RUNNING,MULTICAST> mtu 16384
inet 127.0.0.1/8 lo0
en0: flags=8863<UP,BROADCAST,SMART,RUNNING,SIMPLEX,MULTICAST> mtu 1500
ether 74:a6:cd:b6:eb:7f
inet 192.168.6.39/22 brd 192.168.7.255 en0
utun8: flags=8051<UP,POINTOPOINT,RUNNING,MULTICAST> mtu 1280
inet 100.101.18.109 --> 100.101.18.109/32 utun8
bridge100: flags=8a63<UP,BROADCAST,SMART,RUNNING,ALLMULTI,SIMPLEX,MULTICAST> mtu 1500
ether 76:a6:cd:6b:c1:64
inet 192.168.65.1/24 brd 192.168.65.255 bridge100
The awk statement filters for the lines containing inet and then prints the last field from each line ($NF) and the second field ($2). The last field is the interface name and the second field is the assigned IP address.
The column -t command formats the output into columns.
❯ inet
lo0 127.0.0.1/8
en0 192.168.6.39/22
utun8 100.101.18.109
bridge100 192.168.65.1/24
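For Macs where installing iproute2mac isn’t an option, a similar pairing of interface name and address can be approximated from plain ifconfig output with awk alone. This is a sketch that assumes the usual BSD ifconfig layout, where interface headers start at column zero and inet lines are indented; inet_pairs is a made-up helper name.

```shell
# Remember the most recent interface header, then print it alongside
# any IPv4 address that follows it.
inet_pairs() {
  awk '
    /^[a-z]/     { iface = $1; sub(/:$/, "", iface) }  # new interface section
    $1 == "inet" { print iface, $2 }                   # name + IPv4 address
  '
}

# On a live system:  ifconfig | inet_pairs | column -t
# Demonstrated here on a captured fragment of ifconfig output:
printf 'en0: flags=8863<UP> mtu 1500\n\tinet 192.168.6.39 netmask 0xfffffc00\n' | inet_pairs
# prints: en0 192.168.6.39
```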
Today I learned that you can paste and match surrounding indentation at once. After selecting the
line or block of lines to be pasted, use ]p instead of only p. Et voilà.
I am embarrassed to think of the hundreds, maybe thousands, of copy-paste operations I’ve done that were immediately followed by selecting the newly pasted block and fixing its indentation. I need to read the friendly manual more.
Enough years ago that I no longer remember when I discovered this trick, I added the following
clause to my .ssh/config
file.
# Access GitHub even when port 22 isn't available
Host github.com
HostName ssh.github.com
Port 443
User git
In a nutshell, this allows you to access GitHub over port 443, instead of the usual port 22. Various places block port 22 traffic; having this in your configuration file sidesteps that problem.
Until last week this worked perfectly. Then it stopped working, but only on my work laptop. Not on my personal laptop, not on the Linux admin workstation running in AWS, not from any of my collection of Raspberry Pis or the Intel NUC that is my IRC and mutt host, only from my work laptop.
Thanks to a GitHub repository, all my configuration files are shared between the various computers I
use. The same .ssh/config
worked everywhere but my work computer.
Recently my employer suffered an intrusion: we were hacked. Security has been tightened considerably, including not allowing ssh traffic over ports other than :22. As soon as I commented out the GitHub clause, all my git commands started working again.
Some searching, reading of the ssh man page, and a little experimenting allowed me to create this slightly altered clause.
# Access GitHub even when port 22 isn't available
# Host github.com
Match user !<user> host "github.com"
HostName ssh.github.com
Port 443
User git
The Match keyword allows you to place conditions on the directives that follow. The user and host keywords (which must be lowercase) let me place a guard around the GitHub port 443 setup. If I am signed in as my work id, the clause is skipped, meaning a normal ssh connection over port 22. For any other user the clause is used to access GitHub. There are several other criteria that Match can operate against, allowing you to create some sophisticated conditional restrictions in your .ssh/config file.
For a long time Apple Music (née iTunes) offered an option called “Genius Playlist”. You selected a track from your collection, and Apple generated a playlist of 25 titles that were somehow related to the seed track. I had 15 or 20 of these. It was my own personal radio station. One without ads or an annoying DJ talking over the intro or outro of a song.
I’m not sure when Genius Playlists were eliminated from Music, but I miss that option. So I set out to make my own. Once upon a time fifteen or more years ago, I remember reading an article where someone created a set of smart playlists that categorized tracks, and then a master playlist that used those category lists as its input.
What I have created are four category smart playlists. Before building them, I scrolled through the list of tracks and ticked the “favorite” star on several hundred tracks that I particularly like. The playlists are:

Never Played: Tracks that are favorites, and have a play count of zero.
Not Played in 90 Days: Tracks that are favorites, that haven’t been played in the last 90 days, but have a non-zero play count.
Not Recently Played: Tracks that are favorites, that have been played less than 90 days ago, but more than 15 days ago.
Recently Played: Tracks that are favorites, that have been played in the past 14 days.
Each of these category lists is limited to 50 tracks, selected randomly.
The master list draws from all four category lists. Where the category lists AND all their rules together (Favorite AND never played), the master list ORs its rules (Playlist is Never Played OR Playlist is Not Played in 90 Days OR …).
The master list is also set to only 25 tracks, randomly selected.
I understand that the Never Played playlist will eventually run dry. And I’m not completely satisfied with using date ranges for the rest of the lists. What would work better, but requires considerable work, would be to rate tracks from 1 to 5 stars, and build lists around ratings and play frequency. A long term project.
I’ve been using the Neovim nightly build for some time. The way I have been accomplishing this is to update my local clone of the Neovim repository, and then make and install the application. This works, but it does take a little time.
With asdf and the neovim plugin I can easily update to the latest nightly version.
To add the neovim plugin to asdf:
asdf plugin add neovim
To install the nightly build of neovim
asdf install neovim nightly
To make the nightly version the global default
asdf global neovim nightly
In order to update the nightly version you need to remove the old nightly version first, so I have this bash alias setup.
alias update-nvim='asdf uninstall neovim nightly && asdf install neovim nightly'
And, since asdf will let me have multiple versions of Neovim installed, I could keep the stable version on hand and use it for a project if I wanted to.
After reading Thorsten Ball’s Register Spill newsletter about New Year, new job, new machine I decided to give asdf a try. It’s a single piece of software designed to manage multiple versions of any number of other pieces of software. What rbenv or rvm is to Ruby, asdf is to Ruby, and Python, and NodeJS, and, and, and.
Here’s how I set it up.
1. Clone the GitHub repository. Installing it via Homebrew apparently has some issues. Running this command will clone the v0.13.1 version into the .asdf directory in your $HOME.
git clone https://github.com/asdf-vm/asdf.git ~/.asdf --branch v0.13.1
2. Install some plugins to manage the software of your choice. You can get a complete list of plugins available by running:
asdf plugin-list-all
I actually piped the output through grep to make finding the software I wanted a bit quicker.
asdf plugin-list-all | grep ruby
With the plugin name and repository information, run:
asdf plugin add ruby
3. Determine the version, or versions, of the software you want.
asdf list all ruby
4. Install the software.
asdf install ruby latest
or
asdf install ruby 3.3.0
5. Set the version globally. (It can be overridden on a project by project basis.)
asdf global ruby latest
There is no Step Six.
Using asdf
means I have one set of software version management commands to remember, and one
location where that information, and those versions, are kept. Better still, the version information
can be shared, say with your team, ensuring everyone has the same versions of required tools
installed.
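That shared version information lives in a plain .tool-versions file, committed at a project’s root (or placed in $HOME for global defaults). A hypothetical example, with made-up versions:

```text
ruby 3.3.0
nodejs 20.11.0
neovim nightly
```

Anyone who checks out the project and runs asdf install gets exactly these versions.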
Here are five bash aliases that I find useful.
alias 'dus=du -sckx * | sort -nr'
Here is a breakdown of the command.
du – estimates file system usage
-s – creates a total for each argument
-c – creates a grand total
-k – sets the block size to kilobytes. Using -m would set it to megabytes
-x – skips directories on other file systems

The output of the du -sckx * command is then piped to sort -nr.
sort – sorts lines
-n – does a numerical sort
-r – reverses the result, so it is in descending order

alias inet='ifconfig | grep inet | grep -v inet6'
Here is a breakdown of the command.
ifconfig – lists the current network interface configuration
| grep inet – returns only the lines with inet
| grep -v inet6 – eliminates those lines with inet6, i.e., IPv6

Depending on your OS you may need to use ip a instead of ifconfig.
alias latr='ls -latr'
Here is a breakdown of the command.
ls – lists directory contents
-l – use the long format
-a – do not ignore hidden files
-t – sort by modification time, newest first
-r – reverse the sort order

The mnemonic I use for this alias is “later”.
alias 'define=curl dict://dict.org/define:"$@"'
Here is a breakdown of the command.
curl – transfers a URL, in this case dict://dict.org/define
"$@" – the search term provided

alias makepass='openssl rand -base64 15'
Here is a breakdown of the command.
openssl – runs the openssl command line tool
rand – generates pseudo-random bytes
-base64 – encodes the output with base64
15 – generates 15 random bytes, which base64-encode to a 20-character password
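One detail worth noting, assuming a stock openssl on your PATH: -base64 15 requests 15 random bytes, and base64 expands every 3 bytes into 4 characters, so the resulting password is 20 characters long, not 15. A quick check:

```shell
# 15 random bytes base64-encode to a 20-character string
# (every 3 bytes become 4 characters, and 15 is divisible by 3,
# so there is no padding).
pw=$(openssl rand -base64 15)
echo "${#pw}"
# prints: 20
```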
In December 2005 I purchased a license for Mint, a website visitor tracking tool. Written in PHP and utilizing JavaScript, it was an elegant approach to understanding what kind of traffic your website was getting. Mint hasn’t been developed in ten years, and is no longer supported. I have had to patch the source code a couple of times in recent years to keep it functioning. Mint showed you visits (by hour, day, week, month, or year), session information, referrers, pages viewed, and information about the browser/platform used by the visitor. The site layout was beautifully constructed and a joy to use. Mint also allowed for plugins, called Peppers. There was at one time a fairly active set of Peppers you could add to gain further insight into your site’s visitors.
Both my wife and I have used and continue to use Mint, largely because the current crop of visit tracking services and products are all aimed at competing with Google Analytics. They are complex, have noisy interfaces, and generally aren’t useful for our purposes. I have been looking for a new site tracking tool for years, but hadn’t found one I liked. That is, until I happened on to Umami.
The interface is simple, and, as my wife put it, “highly clickable”. It doesn’t try to be the next Google Analytics, but it does provide lots of useful information for owners of small websites. Visit counts, pages viewed, visit duration, and information about visitors’ locations and browsers. Even better, it is respectful of data privacy, and meets EU GDPR requirements. There is a cloud service and a self-hosted option.
Signing up for the cloud service was quick and easy. Once you add a site, a tracking code is generated that you add to the HTML <head> section of your page. For my site, which is statically generated using Hugo, adding the tracking code was simple. Adding the code to WordPress for my wife’s site was a bit more complicated.
Her site uses a free version of a highly customizable theme, which makes adding code, ah, tricky.
There is a plugin called “Integrate Umami” that purports to insert the tracking code into your
WordPress site, but I was unable to get it to work. Instead I found another plugin called
WPCode that lets you, among
other things, insert code into the <head>
section of your site. That solved the problem of getting
the tracking code for my wife’s site in place.
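For reference, the tracking code Umami generates is a single script tag placed in the <head> section; this is a sketch of its general shape, with a placeholder website id, assuming the cloud-hosted script URL (a self-hosted install serves its own copy of the script):

```html
<!-- Hypothetical Umami tracking snippet; data-website-id is a placeholder -->
<script defer src="https://cloud.umami.is/script.js"
        data-website-id="00000000-0000-0000-0000-000000000000"></script>
```

Whether you paste it into a Hugo head partial or inject it with a WordPress plugin, this one tag is all the integration amounts to.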
We’ve only had Umami for two days now, so it is too early to tell how satisfied we’ll be in the long term, but our initial reaction is positive. For now I’m leaving our creaky Mint infrastructure in place. It’ll be interesting to compare the numbers between Mint and Umami. Umami also provides some API documentation; it might be interesting to see if I can import at least some of the history from Mint into Umami.
For the past 3 (4?) weeks my iPhone 14 Pro has been randomly rebooting itself. Once a day, maybe once a week. Annoying, but not the end of the world. Yesterday morning the phone rebooted over and over. After each restart it would work for a minute or two and then reboot again. After a quick search I learned that you can force restart an iPhone by quickly pressing and releasing the volume up button, then the volume down button, and finally holding the side button until the Apple logo appears.
Doing that got it out of the endless reboot cycle. When I restarted it again it started and appeared to be working. I tried to make a backup to my laptop, but it rebooted in the middle of that process. More searching suggested that upgrading the operating system was something to try. The 17.1 release was out, but I hadn’t installed it yet. It took maybe 4 or 5 more reboots before I managed to get the upgrade to complete.
After the upgrade it seemed more stable. Still, I initiated a support chat with Apple. After collecting some diagnostics remotely, the support representative said that I needed to bring the phone in for repair. She said there were “stability issues”.
I did complete an iCloud backup, and for the rest of the day, my phone mostly worked. It only rebooted one more time (that I was aware of).
This morning it restarted and then it came up with the restore screen. This is an image of a laptop and a lightning cable. You need to plug the phone in. Doing that presented me with two options: update or restore. Updating reinstalls the operating system while keeping the previous settings; restoring wipes the phone and starts over fresh.
The restore failed twice in a row. I tried the update once; it also failed. I started another support chat, wanting to see how (or if) I could restore my backup to my iPhone 6S Plus. Between the 6S and the 14 Pro I had a 12 Pro. That phone was dropped and had a small chip in the edge of the screen. I traded it in last September for the 14 Pro, so the only device I could go back to was the 6S Plus.
As it turns out there are two reasons why that couldn’t work. One, you have to restore to the same OS version, and the 6S tops out at iOS 15, a long way from 17.1. Two, you need at least as much storage. My 14 Pro had just over 130 GB in use, far more than the 6S could ever hold.
I was able to make a Genius Bar appointment for the middle of the afternoon. The nearest Apple store is 125 miles away in Kansas City. After a quick shower and lunch my wife and I drove to Kansas City to the Apple Store.
The people there were very helpful, and frank. After reviewing the chat records the woman, whose name escapes me, said that the “stability” issues were an indication that the main board had failed. The only way this phone would work again would be to replace that board, at a cost of about $650.
The other option was to buy a new iPhone. Using the Apple Card I could make monthly payments, so $55-ish dollars a month instead of $650 all at once. In the end I decided to get the new phone.
After waiting about 20 minutes for the pick-up-in-store order to arrive after buying it online, I went through the setup and restore process in the store. The phone had to install the 17.1 update first, which took a good 30 minutes, before the restore from iCloud started. Transferring the phone number from the old phone to the new phone was not straightforward.
Since my old phone was inoperable, there was no way to confirm the switch using that device. So I had to sign into AT&T and initiate the transfer there. Fortunately I knew the sign in account and password. The representative at the setup table said most people didn’t know how to access their account that way. I had to have it send a code to my wife’s phone (also on the same account) and then have her forward that code to me. Having my laptop with me proved to be useful. I was able to use Messages on the Mac to chat with her.
One surprise silver lining was that my Apple Watch synced with the new iPhone. I didn’t even lose today’s activity rings. Having closed all three activity rings every day for the past 8 years, I wasn’t happy about the possibility that I’d lose my streak over a dead phone.
Once the number was activated, and the restore was progressing I was able to leave the store with a working phone. My iPhone 14 Pro case sorta fits on the 15 Pro. It fits well enough to protect it until a new case arrives on Tuesday. I also ordered a screen protector.
I did spring for Apple Care, for the first time ever. Had the 14 Pro been covered, I would have gotten a new phone at no cost to me. Even if the 14 Pro was damaged, I’d only have to pay a nominal $29 to repair the screen, and then get a new phone. I have had iPhones since the 4 came out, and have never had one die like this. The iPhone 12 Pro was the first one that ever got damaged.
I am grateful that we were able to travel for 4+ hours today to get to an Apple Store. And that I can afford another $50 a month for a new phone—while still paying off the last 11 months of the new iPhone 14 Pro shaped paper weight I have. I am grateful that the Apple people, both on chat and in person, were friendly, helpful, and empathetic.