Home Automation, Hubs, Protocols, and What I Wish I Knew When Starting Out


Embarking on the journey of home automation can feel like stepping into a realm filled with endless possibilities and, admittedly, a bit of confusion. When I first dived into this world, the array of protocols – Z-Wave, Matter, Zigbee, and more – felt overwhelming. Through trial, error, and a lot of learning, I’ve found a setup that works seamlessly for me, centered around Zigbee and Home Assistant. Here’s a breakdown of what I wish I knew when starting out, and how different protocols differ and operate.

Understanding the Protocols

Z-Wave: Operates on a low-frequency, mesh network that excels in reliability and range. Its strength lies in its ability to build a robust network where devices can relay information to one another, making it ideal for larger homes. However, Z-Wave operates on different frequencies in different countries, which can complicate international device compatibility.

Matter: The new kid on the block, aiming to unify smart home devices under a single, open-source standard. Matter promises to be the bridge that finally allows devices from different ecosystems to work together seamlessly, irrespective of the manufacturer. It operates over Ethernet, Wi-Fi, and Thread (a low-power mesh networking protocol), offering flexibility and ease of integration.

Zigbee: Like Z-Wave, Zigbee is a mesh network protocol, but it operates on the 2.4 GHz band, which it shares with Wi-Fi and Bluetooth. This can lead to interference in crowded wireless environments, but because 2.4 GHz is used worldwide, the same devices work in any region. Zigbee’s open-standard nature has led to widespread adoption by a variety of manufacturers, making it a versatile choice for home automation.

My Journey to Zigbee and Home Assistant

After experimenting with various protocols, I settled on Zigbee for its balance of range, reliability, and device compatibility. The turning point was incorporating a Zigbee dongle into my setup, allowing me to intercept signals and translate them into instructions for Home Assistant – a powerful, open-source home automation platform.

This combination opened up a new world of possibilities. Home Assistant’s expansive support for devices across different protocols (including Z-Wave and Matter, through respective integrations) meant I wasn’t locked into one ecosystem. Yet, by using Zigbee as my primary protocol, I could leverage its mesh network capabilities and extensive device support without being overwhelmed by interference issues.

I’ve got a ConBee dongle on a USB extension cord (needed to reduce interference from being stuck in the back of a computer). It picks up all the Zigbee signals in the house without fail.

I’ve got myself a little Intel NUC on which I host Home Assistant and several other Docker containers. The network is segmented so the primary LAN is separated from the IoT devices, and I’ve blocked those devices from talking to any external services.

The system has been up for about two years now and is running very smoothly.

Expanding the Horizon with Devices and Automations

With Zigbee and Home Assistant at the core of my home automation system, I’ve been able to integrate a wide range of devices:

  • Lighting: Smart bulbs, LED strips, and light switches that can be programmed to adjust based on time of day, presence, or even sync with entertainment systems for immersive experiences.
  • Security: Door/window sensors, motion detectors, and smart locks that enhance home security and provide peace of mind through real-time alerts and automations.
  • Climate Control: Smart thermostats and radiator valves that optimise energy usage and ensure comfort by adjusting to my schedule and preferences.
  • Multimedia: Integration with media servers and smart speakers for a seamless entertainment experience throughout the home.

The true magic, however, lies in the automations. Home Assistant allows for the creation of complex scenarios, such as turning on the lights and heating when I’m nearing home, or setting the perfect movie-watching ambiance with a single command. The possibilities are limited only by imagination.
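As a rough sketch, the “lights when nearing home” idea maps onto a Home Assistant YAML automation along these lines (the entity and zone names here are hypothetical – substitute your own):

```yaml
# hypothetical entities: person.me, light.hallway
automation:
  - alias: "Welcome home lights"
    trigger:
      - platform: zone
        entity_id: person.me
        zone: zone.home
        event: enter
    action:
      - service: light.turn_on
        target:
          entity_id: light.hallway
```

The same pattern extends to heating, scenes and so on by swapping the service call in the action block.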

I now have a single Zigbee dongle and no device hubs (other than a single unit for blinds). The Philips Hue hub is gone, and the SmartThings hub is in a tub in the garage. I’ve also got rid of the horrible Arlo camera system and its ongoing subscription just to access your own video footage.

Concluding Thoughts

Looking back, I wish I had a clearer understanding of the nuances of each protocol and how they fit into the broader ecosystem of home automation. Settling on Zigbee and leveraging Home Assistant’s versatility has allowed me to craft a smart home system that’s both powerful and tailored to my needs. For anyone embarking on their home automation journey, my advice is to consider not just the devices you want to integrate today, but the system’s overall flexibility and how it can grow with your needs. Happy automating!

Next steps – have a look at Shelly devices (generally using Wi-Fi) and see what I can break.

AI and potential impact on IT Architecture

As an IT solution architect, I expect the impact of AI, both generative and traditional, on my role to be profound and multifaceted over the next few years. Here’s a list of the ways I think AI is likely to influence my responsibilities and the broader IT architecture landscape:

1. Enhanced Decision-Making with Predictive Analytics

AI can significantly augment decision-making processes by providing advanced predictive analytics and data insights. This gives us the ability to leverage AI to try to predict future trends in technology and broader business functions. This foresight could be invaluable in planning and implementing technology systems that are future-proof and scalable.

2. Automated System Design and Optimisation

AI technologies can automate many aspects of system design, including network configuration, infrastructure setup, and even software deployment. Depending on the hands-on nature of the organisation, this could shift the architecture role from being heavily involved in the manual setup of systems to overseeing AI-driven automation processes, ensuring they align with organisational goals and standards.

3. Security and Compliance

With the increasing sophistication of cyber threats, AI can play a critical role in security and compliance. AI-driven security solutions can predict, detect, and respond to threats faster than traditional methods. If you’re involved in the IT-Sec space, you may see your role evolve to include the management of these AI security systems, ensuring they are integrated seamlessly with the existing tech stack/SIEM systems etc.

4. Customised Solutions

Generative AI can design customised solutions based on specific corporate needs and adapt over time as those needs change. Your role might increasingly involve specifying parameters and goals for AI systems that then generate and iterate on technology solutions autonomously. Build a solution with components A, B and C today, then watch it replace C with X and Y as growth takes hold, or alternatively scale back.

5. Governance and Ethical Oversight

As AI systems become more integral to organisational operations, the architecture role will likely expand to include governance and ethical oversight of AI use. This includes ensuring AI systems are transparent, explainable, and aligned with ethical standards, especially in sensitive areas like data privacy and bias mitigation.

6. Continuous Learning and Adaptation

The rapid evolution of AI technologies means that continuous learning will become a crucial aspect of the IT architecture role. Keeping up to date with the latest AI advancements (the potential run-away train), understanding their implications for IT architecture, and adapting strategies accordingly will be essential for maintaining job competitiveness and innovation.

7. Collaboration and Communication

AI will necessitate closer collaboration between IT, business units, and external stakeholders. You’ll need to communicate complex AI concepts in understandable terms, facilitating alignment between technology capabilities and business objectives. There’s a gap forming between those architects who’ve been focusing on the tools of the trade from circa the 2010s onwards and those who’ve invested the time and effort to at least keep up with AI topics and agendas.

8. AI in Development and Operations (AIOps)

The integration of AI into IT operations (AIOps) is set to increase efficiency in monitoring, management, and deployment of IT resources. This could transform how IT infrastructure is managed, moving towards more proactive and predictive management models. We’ve essentially become plumbers of the tech, not builders.

9. Innovation and Competitive Advantage

Will our business partners look to technology partners for guidance on driving innovation and competitive advantage through AI? Or will they defer to the age-old approach of using their own budgets to buy in commoditised SaaS offerings before asking us to integrate their toys back into the org? The architect’s task of identifying opportunities for leveraging AI to create new products, services, or business models could become a key part of our responsibilities, so that we bring innovation to the business and not the other way around.

The impact of AI on the role of IT architects is expected to be transformative, requiring a blend of technical acumen, strategic foresight, ethical consideration, and continuous learning. Embracing these changes will not only enhance our ability to contribute to business needs but also position us as key players in shaping its future technology landscape.

Background Check Network Status

Because I live in the land down under, where we’re serviced by a broadband network that’s been screwed over by a number of successive Governments, we can never be guaranteed a consistent service. I wrote the following script to run in the background on my Apple Mac to notify me of any changes to the network status. It requires nc to run, but you can swap that for any command that performs an external connection test, such as a simple ping. It sleeps for 10 seconds at the end of each loop. Obviously adjust to suit your circumstances.



INTERNET_STATUS="UNKNOWN"
while true ; do
  # substitute any reliable external host here - 1.1.1.1 is just an example
  nc -z 1.1.1.1 53 >/dev/null 2>&1
  if [ $? -eq 0 ] ; then
    if [ "$INTERNET_STATUS" = "DOWN" ] || [ "$INTERNET_STATUS" = "UNKNOWN" ] ; then
      # if on a mac you can have siri speak to you using the say command,
      # otherwise change the osascript line to an echo -
      # in my case, a notification pops up to let me know
      /usr/bin/osascript -e 'display notification "The network is up" with title "Network Status"'
    fi
    INTERNET_STATUS="UP"
  else
    if [ "$INTERNET_STATUS" = "UP" ] ; then
      /usr/bin/osascript -e 'display notification "The network is down" with title "Network Status"'
    fi
    INTERNET_STATUS="DOWN"
  fi
  sleep 10
done

It runs in the background and can be killed by simply identifying the process and issuing a kill command. For example, I run it as a file called check_network.sh with chmod +x applied.

To start it, simply run the script from a terminal with a trailing & to put it in the background.

To kill it, look up its pid and issue the kill command.
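A minimal, self-contained sketch of the start/stop cycle (a stand-in script is created here so the example runs on its own – substitute the real check_network.sh):

```shell
# stand-in for check_network.sh so this sketch is self-contained
printf '#!/bin/sh\nwhile true; do sleep 1; done\n' > check_network.sh
chmod +x check_network.sh

# start it in the background
./check_network.sh &

# later: look up its pid and kill it
pid=$(pgrep -f check_network.sh | head -n 1)
kill "$pid"
```

pgrep -f matches against the full command line, which is handy when the script is run via its interpreter rather than by name.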


The Notes Dilemma


I’m predominantly an Apple system user (iPad, Mac Mini Desktop, MacBook Pro, iPhone).

I’ve been using Evernote for general note taking and document archiving since about 2012. Overall it’s been pretty good, with a few integration features that really expanded its use case. Examples are IFTTT and Gmail integration, the Scannable app and the Skitch screen-grab app. The web clipper is also incredibly useful for storing your own copy of web content that may not be around for long (e.g. news snippets, stock reviews).

I use Google Mail as my mail provider and have set up a number of rules to forward incoming emails to Evernote, where I sort them into relevant folders on a monthly basis – Accounts/Income/Expenses/Receipts.

I use IFTTT (with varying degrees of success) to forward attachments from Gmail to Dropbox, although with limited success over time as a result of inconsistency in billing providers.

I used to pay a subscription to Dropbox but stopped when they moved away from supporting encrypted Linux folders. I still have an account but it’s the free version used mainly for emergency transfers between myself and my family members.

Enter iPad 2018

I got a new iPad in late 2018 and have been using it as my main work tool since. I’ll log into the desktop or the laptop in the evenings or weekends, but predominantly use the iPad during the work day.

The problem (and the reason for this post) is the crappiness of the Evernote iPad app, with no sign of a resolution on the horizon. The BIG BUG: you’re in the middle of reading or writing a note (in the middle of a conference or webinar), you switch away from the note to another app, then back to the note, and guess what – the note has scrolled to the top of the page. You then have to scroll down to the point where you left off, which is fine if that’s a scroll to the bottom. It’s a pain in the ass if it’s not. The same happens if you’re using a note to reference any online material: go back to the note and try to find where you were while reading… aaargh! This is friggin’ frustrating.

Another aspect of Evernote that grates is the lack of support for markdown syntax processing. Although not a biggie, it’s a limitation for cross-application integration.

Trials and Tribulations

In attempting to overcome this situation and continue to take notes on the iPad, I tried a number of options. Here they are in chronological order.

Apple Notes

The default Apple app. It’s simple, providing very basic formatting capability, with tables and checklists being about the extent of the ‘additional’ functions. It doesn’t integrate with any apps other than via the ‘share’ option.

It does provide a richer nesting hierarchy than Evernote. There’s no tagging function, although in theory you can use a hack workaround by inserting your own #tags and searching for these as text.

I’ve uploaded a subsection of my Evernote collection to Apple Notes and it seems to cope OK with the volume (around 700 notes), with a few speed glitches along the way.

I’ve opted for all my notes to be iCloud based with none stored locally – this is so I can access everything I need on the iPad and minimise risk of a storage split.

My plan for 2020 is to keep trying with Apple Notes and see if I can overcome limitations by simplifying my own approach to note taking and storage.


Drafts

Drafts is a strange one, and after about three months I’m still forgetting where it fits into my workflow. It’s a basic note editor with a bundle of integration features and a growing list of integrations, with an obviously dedicated and active following.

Drafts is the clutter collector.

The core modus operandi for Drafts is for when you’re not sure what’s going to happen to the note you’re currently working on. Is it a code snippet that you won’t care about tomorrow? Is it a copy of text from a document you’re translating?

The general rule is: if it’s an ephemeral ‘note’, start it in Drafts. If it moves from ephemeral to longer-lasting, complete it in Drafts and push it to Evernote or Apple Notes for longer-term storage.

Drafts supports a massive range of integrations whereby you can take your note and send it as an email, or as a Dropbox or Google Drive file. The list is extensive.

Drafts supports plain text and markdown formatting, plus JavaScript-based actions.

From a workflow perspective, I have to keep reminding myself to open Drafts; I tend to go straight to Notes, then realise I’m collecting a bundle of snippets throughout the day in my main notes application that I’ll have to go back and delete, as they’re just going to cause clutter.

The Drafts iPad app is simply fantastic. Consistent, easy to use, fast, efficient and reliable. I really should make this my go-to for initial content creation.


Notion

I got drawn into Notion as a result of two prominent YouTube proponents (Ali Abdaal and Francesco D’Alessio of ‘Keep Productive’ fame).

What is it?

It’s a personal organisation tool that allows you to create both rich text pages and flexible databases in an all-in-one life-organising app.

It uses the concept of blocks of content – each block can be of a specific type (web links, files, text, tables, headings etc.).

Notion offers a flexible and very attractive layout for organising everything from events, projects, study plans and course notes – you name it, there’s no reason it can’t be stored in Notion… except for one problem.

Notion also has its own web clipping tool. Much like Evernote although not as flexible.

The iPad afterthought:

The Notion app on the iPad is an afterthought. I raised a few limitations with Notion support and got responses back saying ‘we’re aware and looking into it’. I’m not sure if the responses were canned or written by a bot.

Changing the order of blocks on a page is not supported. Importing from other sources? No way. And the waste of real estate from misaligned side padding is just terrible.

So, if you want to organise your world and you work on a desktop or a reasonably sized laptop, Notion is pretty excellent. If, on the other hand, you spend most of your day with tablet in hand: fuggetaboutit.


Gmail

I’ve been using Gmail as my mail exchange service for years. It’s pretty effective, with readers that work consistently well across platforms.

I have a number of filter rules set up to streamline the sending of attachments to Evernote.

I must admit to wanting to move away from Gmail for some time as a result of the prying eyes of the Google ecosystem. The only reason I haven’t moved is the hassle factor, as both my wife and I have our own domain-based email addresses being serviced. Migration to another service is on my todo list.

Apple Mail

I run an Apple mail client on my mac mini which is always on. As you’ll see shortly, this has become a part of my proposed note/file management solution.

And the answer is…

Not simple!

Given each app has its strengths and weaknesses, I’ve decided to go hybrid.

  1. Apple Notes will replace Evernote as my main repository of Projects, Areas of interest, Resources and Archived info.
  2. I’ve set up rules in Apple mail to take incoming attachments and push them to a personal file storage server I use (not Dropbox).
  3. I’m making a call on moving away from storing large attachments (PDFs etc.) as notes – they’re really assets that I should link to from a notes application.
  4. Given the capability of Notion to provide a rich and flexible editing environment for ‘some’ material, I’ll be using that for managing some lists (podcasts, movies to watch, bookmarks etc).
  5. I’ll also use Notion as an organiser of training/tech education material – I’ve found its ability to organise and deep-link material very useful.
  6. I’m going to try and force myself to use drafts for its original purpose…yes to store ‘drafts’ of notes and only share with Apple Notes those notes I think need longer term storage – otherwise I can be diligent and delete the crap I collect in here on a regular basis.
  7. Apple Mail client – I’ve set up filter rules to examine incoming emails with attachments and send them on their merry way to a conceptual Inbox folder on my file server. This replaces the Evernote catch-all that was in place via Gmail filters.


  • Use of Apple Mail client – this comes with the caveat that if I ever move from the Apple Mail client, or if that device is turned off, I’ll have to set up the filter rules somewhere else.
  • Existing content is remaining in Evernote, but I’m flagging folders with a ‘README’ to remind me I’ve migrated content elsewhere. You never know – if they fix the focus issue, I may return.
  • Keeping an eye on Apple Notes performance – I think because I’ve gone ‘All iCloud’, I may regret this decision.
  • Use of a file server for content – while perfectly acceptable for bills, invoices, receipts etc., which I review at tax time, I need to be careful that other file content doesn’t end up in a big information black hole for me to delete five years after collecting it. This is a self-discipline problem.
  • Remembering the split and sticking to it. If I start adding movies to watch to Notes, or Evernote, instead of Notion, the divide is not working.
  • Below is my revised and simplified Notion home page. Must remember to keep it simple. The Work in Progress link is to any on-going study/learning work I happen to be doing.

  • Lack of an Apple web clipper equivalent may also prove a challenge – the Evernote clipper is really good, especially when clipping content of simplified articles. It’s unproductive to be clipping to Evernote or Notion, then doing an export and a subsequent import into Apple Notes. Let’s see how this goes.


This web page was created by me writing in Drafts – using Markdown – then copying and pasting into Notion so I could generate it as HTML for uploading to my website. I then imported the HTML version into Apple Notes for safe keeping, and will delete it from Notion in due course.

Calculating IP Address Ranges

A handy way of calculating how many IP addresses are available in a subnet is as follows:

Eg. 1 Given an IP subnet with a CIDR of /16, how many available IP addresses have you got?

The answer is to subtract the prefix length (16) from 32 and calculate 2 to the power of the remainder… so that’s 2^16 = 65536.

However, beware that some cloud hosting environments pre-allocate addresses for their own use (AWS uses the first 4 and the last 1 in each subnet), so you really only have 65531.

Eg 2. What’s the smallest subnet you can create, with a CIDR of /32?

Answer: 32-32 = 0, so 2^0 = 1 – a single address (the host itself), meaning effectively no range is available.

Eg 3. How many addresses are available with a CIDR of /24?

Answer is 32-24 = 8, and 2^8 = 256. For a subnet such as 192.168.1.0/24, the available IP addresses run from 192.168.1.0 to 192.168.1.255.

So – remember to subtract the prefix length (mask bits) from 32 and raise 2 to the power of the result.
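The rule is easy to put into a small shell function – a quick sketch:

```shell
# address count for a given prefix length: 2^(32 - prefix)
cidr_hosts() {
  echo $(( 1 << (32 - $1) ))
}

cidr_hosts 16   # 65536
cidr_hosts 24   # 256
cidr_hosts 32   # 1
```

Remember to subtract any cloud provider reservations (e.g. five per subnet on AWS) from the result yourself.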

102 Linux Tips

Over 100 tips for beginner to intermediate Linux command line users. I’ll add to the list as and when I think of it. The tips start simple but get a bit more complex as things go along.

_001 Learning commands 101: use ‘man’
To find out what commands and options are available to you, read the manual pages, e.g.:

$ man command

_002 cd
The cd command enables you to move between directories on your system. Like the following, for example:

$ cd /home/user/

_003 cp
The copy command, or cp, can be used to copy files from one location to another. To do this use the command below:

$ cp file /home/user/Desktop/file

_004 mv

Similar to copy, mv instead moves the file to the new location, deleting the original file:

$ mv file /home/user/Desktop/file

_005 rm

The rm command is for removing or deleting files and directories. You can use it like:

$ rm file

_006 mkdir
You can create directories with the mkdir command using something like:

$ mkdir folder

_007 nano

Nano is one of the programs that enables you to edit text in files – it’s vital for working and editing in the command line. You use it like so:

$ nano file

_008 Tab

The Tab key lets you auto-complete commands and file names. Double-tapping will list all objects with a similar name. Auto-completion works for unambiguous file and command names. For example, ‘fir’ gives you ‘firefox’ as no other commands begin with ‘fir’. If you get multiple hits, keep typing to narrow the selection, then hit Tab again.

_009 Up

The Up arrow key has the very simple task of pulling up the last command that was entered, so you can run it again or edit it.

_010 Copy text

If you’re in the terminal, you may want to copy text output to use somewhere. In many terminal emulators you can do this with Ctrl+Shift+C.

_011 Paste text

If you’ve found a command online or need to paste some text into nano, you can enter whatever is on the clipboard with Ctrl+Shift+V in many terminal emulators.

_012 Open terminal

This trick works in a lot of desktop environments: Ctrl+Alt+T is a common shortcut for opening the terminal.

_013 sudo

This lets you do commands as the super user. The super user in this case is root and you just need to add sudo to the start of any command.

_014 Access root
The super user, or root, can also be accessed by logging in as it from the terminal by typing su. You can enter the root user password and then run every command as root without having to use sudo at all.

_015 Stop processes

If you have run a command and either it’s paused, taking too long to load, or it’s done its job and you don’t need to see the rest of it, you can stop it forcefully with Ctrl+C, or suspend it with Ctrl+Z. Remember, only do this if the process is not writing any data, as interrupting a write is not safe and you could find yourself in trouble.

_016 Access root without password

If your regular user is in the sudoers list (i.e. so they can use sudo in general to run things as root), you can get a root shell by using sudo su. This uses your normal user password to access root terminal functions.

To add a regular user to the sudoers list, you would need to first log in as the super user by using su. Then, you would simply run adduser username sudo (replacing ‘username’). Be careful who you add to the sudoers list, though!

_017 Search for hidden files and directories

The ls command can be used in more than just the basic way of listing all the files in a folder. To begin with, you can use it to list all the hidden items along with the normal items by using:

$ ls -a

_018 Home directory shortcut

The home directory is located at /home/user/ in the absolute filesystem, but you can use the tilde (~) to signify the home directory when moving between directories or copying, or doing mostly anything else – like with the following, for example:

$ cd ~

_019 Link commands together

If you want to do a series of commands one after the other, like updating and then upgrading software in Debian, for example, you can use && to have a command run right after the one before. For example:

$ sudo apt-get update && sudo apt-get install libreoffice

_020 Long terminal input

Sometimes when using a list or anything else with a long terminal output, you might not be able to read it well. To make it easier to understand, you can send the output to another command, less, through the | pipe. It works like so:

$ ls | less

_021 Readable storage size

One of the details that ls -l displays is the size of the files that are located on the hard drive. This is shown in bytes, though, which is not always that useful. You can make the sizes more legible simply by changing the -l to -lh, where the long-listing format option (-l) is tweaked with the human-readable option (-h):

$ ls -lh

_022 Move to previous directory

If you want to move back to the directory that you were working on before, there is a cd command to do that easily. This one does not move up the filesystem, but rather back to the last folder that you were in:

$ cd -

_023 Move up directory

The cd command can also be used to move up in the filesystem. Do this with two full stops, like so:

$ cd ..

_024 General wildcards

The asterisk (*) can be used as a wildcard in the terminal to stand for anything. A typical use case is copying or removing specific types of files. For example, if you want to remove all PNG files from a directory, you cd to it and type:

$ rm *.png

_025 More with the pipe

The pipe (|) can be used to feed one command’s output into the next, enabling you to take a piece of data or a string from one command and put it straight into another. This works with grep and other core Linux tools. For example, to filter the process list:

$ ps aux | grep bash

_026 Delete directories

Using rm on a directory with objects within it won’t work, as it needs to also delete the files inside. You can modify the rm command to delete everything within a directory recursively using:

$ rm -r directory

_027 Shutdown command

You can shut down the system from the terminal using the shutdown command, the halt option (-h), which stops all running programs at the same time, and specifying a time of now so it turns off immediately rather than in 60 seconds:

$ sudo shutdown -h now

_028 Display all file information

As well as merely listing the files, we can use ls to list all of the information relating to each file, such as last date modified, permissions and more. Do this by using:

$ ls -l

_029 Reboot from command line

Back in the day, rebooting required a slightly more complex shutdown command: shutdown -r. In recent years it’s been replaced with a very simple:

$ sudo reboot

_030 Timed shutdown

The timing function of the shutdown command can be very useful if you need to wait for a program or cron job to finish before the shutdown occurs. You can use the time to do a normal halt/shutdown, or even with -r for a reboot after ten minutes, with something like:

$ sudo shutdown -h +10

_031 Log out

Logging out of the X session is generally best done from the desktop environment, but if you need to drop back to the login manager, you can do so by restarting the display manager. On many Linux distros, use the command below:

$ sudo service lightdm restart

_032 Debian: update repositories

Debian-based (and Ubuntu-based) distros use apt-get as the command line package manager. One of the quirks of apt-get as a package manager is that before upgrading or installing software, it does not check to see if there’s a newer version in the repositories. Before doing any installation in Debian, use:

$ sudo apt-get update

_033 Debian: install software

Unlike a graphical package manager or software centre, you can’t quite search for the kind of packages you want to install, so you need to know the package name before installing. Once you do though, try:

$ sudo apt-get install package

_034 Debian: update software

You can upgrade the software in Debian from the terminal by first performing the repository update command in Tip 32, followed by the upgrade command below:

$ sudo apt-get upgrade

_035 Debian: uninstall software

As part of package management, apt-get enables you to uninstall software as well. This is simply done by replacing install with remove in the same command that you would use to install said package (Tip 33):

$ sudo apt-get remove package

You can also use purge instead of remove if you want to delete any config files along with it.

_036 Debian: upgrade distro

Debian-based systems can often update to a ‘newer version’, especially on a rolling release or when there’s a new Ubuntu release. Sometimes the prompt won’t show up, so you can do it in the terminal with:

$ sudo apt-get dist-upgrade

_037 Debian: multiple packages

A very simple thing you can do while installing on all platforms is list multiple packages to install at once with the normal installation command. So in Debian it would be:

$ sudo apt-get install package1 package2 package3

_038 Debian: dependencies

Compiling differs between software and they’ll each have a guide on how to go about it. One problem you might face is that it will stop until you can find and install the right dependency. You can get around this by installing auto-apt and then using it during configuration with:

$ sudo auto-apt run ./configure

_039 Debian: force install

Sometimes when installing software, apt-get will refuse to install if specific requirements aren’t met (usually in terms of other packages needing to be installed for the software to work properly). You can force the package to install even without the dependencies using:

$ sudo apt-get download package
$ sudo dpkg -i package

_040 Debian: install binary

In Tip 39, we used dpkg -i to install the binary installer package that we downloaded from the repositories. This same command can be used to install any downloaded binary, either from the repos or from a website.

_041 Debian: manual force install package

If the advice in Tip 39 is still not working, you can force install with dpkg. To do this you just need to add the option --force-all to the installation command to ignore any problems, like so:

$ sudo dpkg --force-all -i package

_042 Red Hat: update software

Unlike apt-get, the yum package manager for Red Hat/Fedora-based distros does not need you to specifically update the repositories. You can merely update all the software using:

$ sudo yum update

_043 Red Hat: install software

Installing with yum is very simple, as long as you know the package name. Yum does have some search facilities though, if you really need to look it up, but once you know what package you want, use the following command:

$ sudo yum install package

_044 Red Hat: uninstall software

Yum can also be used to uninstall any package you have on your system, whether you installed it directly from yum or not. As long as you know the package name you can uninstall with:

$ sudo yum remove package

_045 Red Hat: force install

The force install function on Red Hat and Fedora-based Linux distros requires that you have a package downloaded and ready to install. You can download things with yum and then force the install with:

$ sudo yum install --downloadonly --downloaddir=[directory] package
$ sudo rpm -ivh --force package

_046 Red Hat: manual install

RPM is one of the package installers on Red Hat distros and can be used to install downloaded packages. You can either do something like in Tip 45 and download the package from the repos, or download it from the Internet and install with:

$ sudo rpm -i package

_047 Red Hat: force manual installation

As in Tip 45, you can use RPM to force install packages if there’s a dependency issue or something else wrong with any other packages that you have downloaded. The same command should be used as in Tip 45, with the -ivh and --force options present.

_048 Fedora: distro upgrade

Yum has its own distribution upgrade command, but only in Fedora and they prefer you not to use it unless you have to. Nonetheless, you can use the fedora-upgrade package in yum with:

$ sudo yum install fedora-upgrade

_049 Search a file for a term in files

The basic use of grep is to search through a file for a specific term. It will print out every line containing that term, so it’s best used on files with human-readable content. Use it with:

$ grep hello file

_050 Check for lines

Looking for specific lines in a file is all well and good, but when you then start to hunt them down and you realise the file is hundreds of lines long, you can save yourself a lot of time by getting grep to also print out the line number. You can do this with the -n option:

$ grep -n hello file

_051 Regular expressions

If you need to make a more advanced search with grep, you can use regular expressions. You can replace the search term with ^hello to look for lines that start with hello, or hello$ for lines ending in hello.

$ grep -n ^hello file
$ grep -n hello$ file

_052 Wildcards and grep

When searching for lines, you can use a wildcard if you need to look for similar terms. This is done by using a full stop in the search string – each full stop represents one wildcard character. Searching for h...o will return any five-letter string starting with h and ending with o. Use it like so:

$ grep '\<h...o\>' file

_053 More wildcards

You’ll also be using wildcards to find something ending or beginning with a specific string but with no fixed length. You can do this in grep by using an asterisk (*) along with the dot.

In the above example, we would have used h.*o instead.
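Tips 49–53 can be tried out end to end on a throwaway file. The file name and contents below are invented purely for illustration:

```shell
# Build a small test file (name/contents are made up for this demo).
printf 'hello world\nsay hello\nothellos\nhippo\n' > /tmp/greptest.txt

grep -n '^hello' /tmp/greptest.txt    # lines starting with hello -> 1:hello world
grep -n 'hello$' /tmp/greptest.txt    # lines ending with hello   -> 2:say hello
grep 'h...o' /tmp/greptest.txt        # matches anywhere, even inside 'othellos'
grep '\<h...o\>' /tmp/greptest.txt    # word-bounded: 'othellos' is dropped
```

Note that the patterns are single-quoted so the shell doesn’t touch the backslashes before grep sees them.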

_054 Stop a system service

A system service is the kind of background software that launches at start up. These are controlled by the system management daemons like init or systemd, and can be controlled from the terminal with the service command. First, you can stop a service using:

$ sudo service name stop

_055 Start a service

You can start system services that have been stopped by using the same service command with a different operator. As long as you know the service name, start it using:

$ sudo service name start

_056 Restart a system service

This one is popular when setting up web servers that use Apache, for which restarts may be needed, along with other services that you customise along the way. Instead of running both the stop and start commands sequentially, you can instead restart services by using:

$ sudo service name restart

_057 Know the ID

The ID that you can get for the software with top can be used to manipulate it, but a common reason to know the ID is so that you can end the process if it’s having trouble stopping itself or using too many resources. With the ID in hand, you can kill it with:

$ ps auxw | grep apache
www-data 11671 0.1 3.1 675756 127036 ? S 06:25 0:15 /usr/sbin/apache2 -k start
$ kill 11671

_058 Kill multiple IDs

Sometimes a process can be split up over multiple IDs (this is usually the case with a web browser – Google Chrome is a notorious example), and you need to kill multiple processes at once. You can either try and quickly kill all the IDs, or use the common name for the process and kill it with:

$ killall -v process
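A safe way to rehearse Tips 57–58 without touching a real service: start a throwaway background process, grab its ID from the shell, and kill it. The sleep command here is just a stand-in for a misbehaving process:

```shell
sleep 300 &                       # launch a throwaway background process
pid=$!                            # $! holds the ID of the last background job
kill "$pid"                       # send SIGTERM, the polite 'please stop'
wait "$pid" 2>/dev/null || true   # reap it (a non-zero status is expected)
kill -0 "$pid" 2>/dev/null || echo "process $pid is gone"
```

killall does the same by name instead of ID – `killall -v sleep` would have caught every sleep process at once.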

_060 List USB devices

You may need to know which USB and USB-related devices are connected to a system, or find out their proper designation. You may need to install it first (it’s usually part of the usbutils package), but once you have, you can use lsusb in the terminal to list all of the available devices.

$ lsusb

_061 List hard drives and partitions

Whether you need to check the designation of certain drives for working on them, or you just need to get a general understanding of the system’s layout, you can use fdisk to list all the hard drives. Do this in the terminal with:

$ sudo fdisk -l

_062 Check running software

Sometimes you’ll want to check what’s running on your system and from the terminal this can be done simply with top. It lists all the relevant information you’ll need on your currently running software, such as CPU and memory usage, along with the ID so you can control it.

$ top

_063 Unpack a ZIP file

If you’ve downloaded a ZIP file from the terminal, or you’re working in it, you can unpack it using the unzip command. Use it like so:

$ unzip file.zip

_064 Unpack a TAR file

Sometimes Linux will have a compressed file that is archived as a .tar.gz, or a tarball. You can use the terminal to unpack these or similar TAR files using the tar command, although you need the right options. For the common .gz, it’s:

$ tar -zxvf file.tar.gz
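Tip 64 works in both directions – -c creates a tarball and -x extracts one, with z, v and f meaning gzip, verbose and file respectively. A round trip on files invented for the demo:

```shell
rm -rf /tmp/tardemo && mkdir -p /tmp/tardemo/src
printf 'hi\n' > /tmp/tardemo/src/note.txt
cd /tmp/tardemo
tar -zcvf files.tar.gz src    # c = create a gzipped tarball of src/
rm -r src                     # throw away the original...
tar -zxvf files.tar.gz        # ...and x = extract it back
cat src/note.txt              # the contents survived the round trip
```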

_065 Copy and write disks

Coming from UNIX is a powerful imaging tool called dd, which I’ve been using a lot recently for writing Raspberry Pi SD cards. Use it to create images from discs and hard drives, and to write them back. The if= argument is the input file or drive and of= is the output. It works like so:

$ dd if=image.img of=/dev/sda bs=1M
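Because dd doesn’t care whether if= and of= are devices or plain files, you can rehearse it harmlessly on a file before pointing of= at a real drive (get of= wrong on a real device and dd will happily overwrite it, so always double-check). File names here are invented; status=none just silences GNU dd’s transfer statistics:

```shell
printf 'pretend disk image' > /tmp/disk.img
dd if=/tmp/disk.img of=/tmp/backup.img bs=1M status=none
cmp /tmp/disk.img /tmp/backup.img && echo "copies are identical"
```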

_066 Create an empty file

Sometimes when coding or installing new software, you need a file to write to. You could create it manually with nano and then save it, but the terminal has a command similar to mkdir that enables you to create an empty file – this is touch:

$ touch file

_067 Print into the terminal

The terminal uses echo to print details from files into the terminal, much like the C language. If you’re writing a Bash script and want to see the output of the current section, you can use echo to print out the relevant info straight into the terminal output.
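A minimal sketch of Tip 67 inside a script – the variable and message are made up; echo simply prints whatever you hand it, variables and command substitutions included:

```shell
#!/bin/sh
name="world"                    # invented example variable
echo "processing $name"         # prints: processing world
echo "finished in $(pwd)"       # command substitution works inside too
```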

_068 Check an MD5 hash

When downloading certain files it can help a lot to check to make sure it’s downloaded properly. A lot of sites will offer the ability to check the integrity of the downloaded file by comparing a hash sum based on it. With that MD5 and the file location at hand, you can compare it with:

$ md5sum file
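A worked example of Tip 68: the file is a stand-in for a download, and the hash of the three bytes ‘abc’ is a well-known MD5 test vector, so you can verify it by eye. If a site ships a .md5 file alongside the download, -c does the comparison for you:

```shell
printf 'abc' > /tmp/download.bin
md5sum /tmp/download.bin
# -> 900150983cd24fb0d6963f7d28e17f72  /tmp/download.bin

md5sum /tmp/download.bin > /tmp/download.bin.md5
md5sum -c /tmp/download.bin.md5    # -> /tmp/download.bin: OK
```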

_069 Run commands to X display

Sometimes you need to do something concerning the X display, but the only way you can enter the command line is by switching to an alternate instance with Ctrl+Alt+F2 or similar. To send a command to the main X display, preface it with DISPLAY=":0" so it knows where to go.

_070 Create a new SSH key

When you need to generate a strong encryption key, you can always have a go at creating it in the terminal. You can do this using your email address as identification by entering the following into the terminal:

$ ssh-keygen -t rsa -C "[email protected]"

_071 System details

Sometimes you want to check what you’re running and you can do this with the simple uname command, which you can use in the terminal with the following:

$ uname

_072 Kernel version

As part of uname, you also get the kernel version. Knowing this can be useful for downloading the right header files when compiling modules or updating certain aspects. You can get purely the kernel version by adding the -r option:

$ uname -r

_073 CPU architecture

If you’re on an unknown machine, you might need to find out what kind of architecture you’re running. Find out what the processor is with:

$ uname -p

_074 Everything else

Uname enables you to display a lot of data that is available from the system and you can look at all of this information by simply using the -a option with the command:

$ uname -a

_075 Ubuntu version

With all the distro updates you do, it can be tricky to keep track of which version of Ubuntu you are on. You can check by using:

$ lsb_release -a

_082 Open files in terminal

If you’ve got a file you can see in a graphical file manager, instead of opening a terminal, navigating to it and then executing the file or script, you can usually run it straight from the file manager. To do this you usually just need to right-click and select a ‘run in terminal’ option.

_083 Find files in terminal

You can search for specific files throughout the filesystem by using the find command. You need to give find a location to search in and a parameter to search for. For simply searching for a file from root with a specific name you can use:

$ find / -name file
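Tip 83 on a directory tree created just for the demo – find descends from the location you give it and prints every match, and -name also accepts shell-style wildcards if you quote them:

```shell
mkdir -p /tmp/findtest/deep/deeper
touch /tmp/findtest/deep/deeper/target.conf

find /tmp/findtest -name target.conf
# -> /tmp/findtest/deep/deeper/target.conf

find /tmp/findtest -name '*.conf' -type f   # wildcard match, regular files only
```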

_084 Locate files in terminal

Similar to find is locate, a newer tool that works slightly differently. While find can also search for files by age, size, owner and so on, locate only really searches by name – but thanks to its pre-built index it is much faster. Use it with:

$ locate file

_085 Current location

In the terminal you might find yourself a little lost in the filesystem, or want a quick and easy way to pipe your current location into something else. You can print out your current absolute location using pwd.

$ pwd

_086 Move directories

When moving between directories you might want to be able to quickly return to the previous one that you were using. Instead of using cd for moving to the directory, you can instead use pushd to move and create a ‘stack’ of directories to move between.

$ pushd directory

_087 Move back through pushd

Following on from Tip 86, once you want to start moving back up the stack to the first directory, you can use popd in the terminal. You can also check which directories you have stacked up by using the dirs command as well.

$ popd
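Tips 86–87 in a single session. pushd, popd and dirs are shell builtins (bash/zsh), so the sketch runs them explicitly under bash; the directories are just ones that exist on any system:

```shell
bash -c '
  cd /tmp
  pushd /usr > /dev/null   # go to /usr, remembering /tmp on the stack
  pushd /etc > /dev/null   # go to /etc; the stack is now /etc /usr /tmp
  dirs                     # show the stack
  popd > /dev/null         # pop back to /usr
  pwd                      # -> /usr
'
```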

_088 Process priorities

CPU priority for processes runs from -20 (highest priority) to 19 (lowest). Putting "nice -n X" in front of any command enables you to change the priority from the default 0 to whatever number X is standing in for. Only root (or sudo) can raise the priority of a process, but anyone can move one down the priority chain.
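A quick check that this works as described: nice run with no arguments prints the niceness it is itself running at, so wrapping it in nice -n 10 should print a value 10 higher than the default (raising priority, i.e. a negative X, would need root):

```shell
nice                 # current niceness, normally 0
nice -n 10 nice      # run 'nice' itself at +10 -> prints 10 on a default setup
nice -n 10 sh -c 'echo heavy job running politely'
```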

_089 Download via the terminal

If you need to download something via the Internet in the terminal, you’ll want to use the wget command. It will take any URL you specify and download it directly into your current location in the terminal (as long as you have permission to do so). Use it like:

$ wget http://example.com/file.zip

_090 Change image formats

Instead of loading up an image editor like GIMP, you can actually change images in the terminal using the convert command. You can very simply use it to change the filetype or even change the size. Do it with:

$ convert image.jpg image.png

_091 Alter image settings

As well as convert there’s also mogrify, which is part of the same software package (ImageMagick). You can use it to scale, rotate and do more to an image more easily than with some of the convert commands. To resize you can use something like:

$ mogrify -resize 640x480! image.png

_092 Send message to displays

This is an old school prank that can actually have good uses when done right. Xmessage enables you to send a message prompt to an x display on the same system, as long as you know the display you want to send it to. Try:

$ DISPLAY=:0 xmessage -center “Hello World!”

_093 Rename files with cp

This is an extension of the way you can use cp – cp will copy a file and name it to whatever you want, so you can either copy it to another directory and give it a different name, or you can just copy it into the same folder with a different name and delete the original. This also works for renaming with mv.

_094 Manual screenshots

This is something we have to do during Raspberry Pi tutorials, but it works everywhere else. For GNOME-based desktops you can do it in the terminal by calling gnome-screenshot; in XFCE it’s xfce4-screenshooter; in LXDE it’s scrot, and so on. This will immediately take a screenshot of what you can see.

$ gnome-screenshot
$ xfce4-screenshooter
$ scrot

_095 Delayed screenshots

With the tools above, you can add a delay to set up a screenshot. This is useful if you want to show the contents of a drop-down menu, for example. Screenshot tools allow you to delay by adding the -d option and a number of seconds, eg:

$ scrot -d 5

_096 Do the locomotion

The story behind this is that, apparently, after finding a lot of people misspelling the list command ‘ls’ as ‘sl’, this joke program was made to hammer home that they were indeed not using the correct command. It plays an animation of a steam locomotive: sl.

$ sl

_097 What does the cow say

A Debian trick as part of apt-get, moo is an easter egg wherein you type apt-get moo and a cow asks you if you have mooed today. If you have, in fact, not mooed today then you should immediately find a quiet room and let out a single, loud moo to appease the terminal cow.

$ apt-get moo

_098 Correct typos with ^old^new^

This little trick is great for quickly correcting typing mistakes. The syntax is ^old^new^: it reruns the previous command with the first occurrence of old replaced by new. If you omit new (i.e. ^old^), then old is simply removed from the previous command. By default, only the first occurrence of old is replaced. To replace every occurrence, append “:&”. You can omit the trailing caret symbol, except when using “:&”.

$ grpe peter /etc/passwd
-bash: grpe: command not found
$ ^pe^ep


$ grep petermac /etc/passwd ; ls -ld /home/petermac
ls: cannot access /home/petermac: No such file or directory
$ ^petermac^peter_mac^:&

_099 Reference a Word of the Current Command and Reuse It

$ !#:N

The “!#” event designator represents the current command line, while the :N word designator represents a word on the command line. Word references are zero based, so the first word, which is almost always a command, is :0, the second word, or first argument to the command, is :1, etc.

$ mv Working-with-Files.pdf Chapter-18-!#:1

This translates to:

mv Working-with-Files.pdf Chapter-18-Working-with-Files.pdf

_100 Save a Copy of Your Command Line Session

$ script

Everything typed and displayed from this point is recorded (by default to a file called typescript). Do your work, then end the recording with:

$ exit

_101 Use Vim to Edit Files over the Network

$ vim scp://remote-user@remote-host//path/to/file

_102 Display Your Command Search Path in a Human Readable Format

$ echo $PATH | tr ':' '\n'

Show Open Network Connections

$ sudo lsof -Pni

The lsof command can be used not only to display open files, but also open network ports and network connections.

The -P option prevents the conversion of port numbers to port names.

The -n option prevents the conversion of IP addresses to host names.

The -i option tells lsof to display network connections.

Securing PostgreSQL

Install PostgreSQL

sudo apt-get update
sudo apt-get install postgresql postgresql-contrib

Upon installation, Postgres creates a Linux user called “postgres” which can be used to access the system. We can change to this user by typing:

sudo su - postgres

From here, we can connect to the system by typing:

psql

Notice how we can connect without a password. This is because Postgres authenticates by username, which it assumes is secure for local connections.

Exit out of PostgreSQL and the postgres user by typing the following:

\q
exit
Disable Remote Connections

sudo vi /etc/postgresql/9.1/main/pg_hba.conf

Ensure the contents of the config file look something like the following:

local   all             postgres                                peer
local   all             all                                     peer
host    all             all               md5
host    all             all             ::1/128                 md5

The first two lines specify “local” as the scope that they apply to, meaning they use Unix domain sockets.
The second two declarations are “host” (network) entries, but the addresses they apply to ( and ::1/128) are loopback interfaces, i.e. the local machine only.

What If You Need To Access the Databases Remotely?

To access PostgreSQL from a remote location, consider using SSH to connect to the database machine and then using a local connection to the database from there.
It is also possible to tunnel access to PostgreSQL through SSH so that the client machine can connect to the remote database as if it were local. Visit the PostgreSQL official documentation pages.

Use roles to lock down access to individual databases

Log into PostgreSQL:

sudo su - postgres

To create a new role, type the following:

CREATE ROLE role_name WITH optional_permissions;

To see the permissions you can assign, type:

\h CREATE ROLE
You can alter the permissions of any role by typing:

ALTER ROLE role_name WITH optional_permissions;

List the current roles and their attributes by typing:

\du

List of roles
Role name |                   Attributes                   | Member of
hello     | Create DB                                      | {}
postgres  | Superuser, Create role, Create DB, Replication | {}
testuser  |                                                | {}

Create a new user and assign appropriate permissions for every new application that will be using PostgreSQL.

Renewing letsencrypt certs for Apache

To renew as standalone and restart apache

sudo service apache2 stop
sudo certbot certonly --standalone

Saving debug log to /home/peter/appdir/~/.certbot/logs/letsencrypt.log
Plugins selected: Authenticator standalone, Installer None
Please enter in your domain name(s) (comma and/or space separated) (Enter 'c'
to cancel): staging.spxtrader.com.au
Obtaining a new certificate
Performing the following challenges:
http-01 challenge for staging.spxtrader.com.au
Waiting for verification...
Cleaning up challenges
Non-standard path(s), might not work with crontab installed by your operating system package manager
- Congratulations! Your certificate and chain have been saved at:
Your key file has been saved at:
Your cert will expire on 2020-01-04. To obtain a new or tweaked
version of this certificate in the future, simply run certbot
again. To non-interactively renew *all* of your certificates, run
"certbot renew"
- If you like Certbot, please consider supporting our work by:

Donating to ISRG / Let's Encrypt: https://letsencrypt.org/donate
Donating to EFF: https://eff.org/donate-le

Then update your /etc/apache2/sites-available/vhosts-ssl.conf with the new certificate paths and start Apache again:

sudo service apache2 start
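To avoid doing the stop/renew/start dance by hand every ~90 days, a cron entry can drive it. This is a sketch with an assumed schedule (3am Mondays) and assumed file location; certbot renew only acts on certificates that are close to expiry, and its --pre-hook/--post-hook options run commands before and after an actual renewal:

```
# /etc/cron.d/certbot-renew (illustrative path and schedule)
0 3 * * 1 root certbot renew --pre-hook "service apache2 stop" --post-hook "service apache2 start"
```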

TMUX Operations

Tmux cheat sheet

(C-x means ctrl+x, M-x means alt+x)
Prefix key

The default prefix is C-b. If you (or your muscle memory) prefer C-a, you need to add this to ~/.tmux.conf:

# Remap prefix to Control + a
set -g prefix C-a

# Bind 'C-a C-a' to type 'C-a'
bind C-a send-prefix
unbind C-b

I’m going to assume that C-a is your prefix.

Sessions, Windows and Panes
Session is a set of windows, plus a notion of which window is current.
Window is a single screen covered with panes. (One might compare it to a ‘virtual desktop’ or a ‘space’.)
Pane is a rectangular part of a window that runs a specific command, e.g. a shell.
Getting help

Display a list of keyboard shortcuts:

C-a ?

Navigate using Vim or Emacs shortcuts, depending on the value of mode-keys. Emacs is the default, and if you want Vim shortcuts for help and copy modes (e.g. j, k, C-u, C-d), add the following line to ~/.tmux.conf:

setw -g mode-keys vi

Any command mentioned in this list can be executed as tmux something or C-a :something (or added to ~/.tmux.conf).
Managing sessions

Creating a session:

tmux new -s work

Create a new session that shares all windows with an existing session, but has its own separate notion of which window is current:

tmux new-session -s work2 -t work

Attach to a session:

tmux attach -t work

Detach from a session:

C-a d.

Switch between sessions:

C-a (          previous session
C-a )          next session
C-a L          ‘last’ (previously used) session
C-a s          choose a session from a list


C-a $          rename the current session

Managing windows

Create a window:

C-a c          create a new window

Switch between windows:

C-a 1 ...      switch to window 1, ..., 9, 0
C-a 9
C-a 0
C-a p          previous window
C-a n          next window
C-a l          ‘last’ (previously used) window
C-a w          choose window from a list

Switch between windows with a twist:

C-a M-n        next window with a bell, activity or
content alert
C-a M-p        previous such window


C-a ,          rename the current window
C-a &          kill the current window

Managing split panes

Creating a new pane by splitting an existing one:

C-a "          split vertically (top/bottom)
C-a %          split horizontally (left/right)

Switching between panes:

C-a left       go to the next pane on the left
C-a right      (or one of these other directions)
C-a up
C-a down
C-a o          go to the next pane (cycle through all of them)
C-a ;          go to the ‘last’ (previously used) pane

Moving panes around:

C-a {          move the current pane to the previous position
C-a }          move the current pane to the next position
C-a C-o        rotate window ‘up’ (i.e. move all panes)
C-a M-o        rotate window ‘down’
C-a !          move the current pane into a new separate
window (‘break pane’)
C-a :move-pane -t :3.2
split window 3's pane 2 and move the current pane there

Resizing panes:

C-a M-up, C-a M-down, C-a M-left, C-a M-right
resize by 5 rows/columns
C-a C-up, C-a C-down, C-a C-left, C-a C-right
resize by 1 row/column

Applying predefined layouts:

C-a M-1        switch to even-horizontal layout
C-a M-2        switch to even-vertical layout
C-a M-3        switch to main-horizontal layout
C-a M-4        switch to main-vertical layout
C-a M-5        switch to tiled layout
C-a space      switch to the next layout


C-a x          kill the current pane
C-a q          display pane numbers for a short while

Other config file settings

Force a reload of the config file on C-a r:

unbind r
bind r source-file ~/.tmux.conf

Some other settings that I use:

setw -g xterm-keys on

If you have vi-style key bindings on, then the following applies:

1) Enter copy mode using Control+b [

2) Navigate to the beginning of the text you want to select and hit Space

3) Move around using the arrow keys to select the region

4) When you reach the end of the region, hit Enter to copy it

5) Now Control+b ] will paste the selection

To enable vi-like cursor movement in copy mode, put the following in your ~/.tmux.conf:

set-window-option -g mode-keys vi

Moreover, whatever you copy can be dumped out in your terminal using:

tmux show-buffer

and even saved to a file (say, foo.txt) using:

tmux save-buffer foo.txt

To see all the paste buffers, try Control+b #. To dump the various buffers to the terminal or a file, you can use:

tmux list-buffers
tmux show-buffer -b n
tmux save-buffer -b n foo.txt

where n is the index of the paste buffer.

Home Automation with Nue hub

My experience so far with the combination of the 3ASmartHome ‘Nue’ and Philips Hue components.

The Goal

To automate light operations around the house with integration between Alexa, IFTTT and general schedules.

I have replaced old-fashioned halogen lights with LEDs over time and didn’t want to blow that investment (approx 30 downlights at $30-$40 per light), so I was looking for a smart light switch solution instead. I also knew a few automations would be motion driven, so integration with motion sensors would also be ideal.

Below is an account of my experience with the 3asmarthome switches and assorted components available from 3asmarthome.com.au based in Cheltenham in Melbourne, Australia.

Smart Switches

The 3A smart switches look fairly smart and modern. The build is pretty good, the only concern being a slight movement on one or two units when installed, due to the outer casing being a tiny bit too large. The switches are good for controlling lights through Alexa voice activation. Detection of devices in the Hui app is good – but control is limited to rudimentary ‘scene’ conditions. There is no way to program the switches directly (no openHAB/Hass.io integration). This is disappointing, given that the use of the Zigbee 3.0 protocol implies the devices should be identifiable and controllable.

Installation of the switches will only work where a neutral (black in AU) wire is available at the switch. A number of 2-way and 3-way circuits in my house don’t have one available. This is a pain, as it results in new (smart) switches being mixed with old. There is currently no dumb switch option (ie. with the same look and feel as the smart ones) to install for consistency around the house.

The lack of a switch option in these cases means a smart light bulb is the only option. So I end up with a dumb light switch controlling a smart light bulb…there is an upside with smart lights, as detailed below.
On the downside, a dumb switch controlling a smart light means the switch must be left on if you want any scenes/automations to work. If you switch the light off, this obviously cuts power to the bulb and you’re back to an ‘off’ smart light bulb. This is not a good place to be.

Smart Light bulbs

Effective and cheaper than the Philips ones. Multi-colour, identifiable and controllable by the Philips Hue app (and Alexa). The light status in the Hue app has to be set as ‘always on’ to allow remote programming/control through scenes. They fit the existing downlight holes, so no extra cutting is required (which I’ve read is needed with the Philips downlights). I’m thinking of swapping out some lights (controlled by smart switches) for smart bulbs to provide more control (office, laundry, cellar). Light quality is fairly good, but I’ve nothing to compare it against. On the downside (referred to above), you have to leave the light switch on and control them via app/voice, otherwise your smart bulb becomes dumb very quickly.

Motion Sensors

Very finicky to set up – multiple attempts failed, then as if by magic (and with support from the 3ASmartHome guys) everything worked, though I’m still not sure what I changed. The Hui app is extremely frustrating in its simplicity and lack of intuitive controls. Once a motion sensor is set to trigger based on motion, it can’t be set to trigger only within a timeframe <TODO>. This results in lights coming on at all times during the day. I’m sure I need to go back to the drawing board with this…it just can’t be so!!

I also purchased a few Philips Hue motion sensors, given the trials I had with the 3A ones. The outcome is that they can only be used to control other Philips Hue devices and the 3A smart lights. They can’t be used to control the 3A light switches…this is a bummer given the number of switches I’ve installed. The upshot: install smart lights where you want motion control (via Philips Hue) and switches where voice control is preferred.

Infrared Control

Depends on a specific ‘Smart Life’ app. Once installed and configured, it’s very versatile, offering an absolute tonne of devices to control. I had it controlling my 10+ year old aircon within a few minutes. I set up a ‘false’ condition to turn on the aircon when the temperature dropped below 21°C…the ambient temperature was about 17°C…next thing the aircon was humming away.